Using Transactions

The SQLite PowerShell provider is transaction-aware. If the PowerShell session has an active transaction, the provider enlists in it for any database activity performed through provider cmdlets that specify the -useTransaction parameter.

For example, this code wraps the creation of 100 users in a transaction. If the transaction does not complete (for example, if the computer crashes before Complete-Transaction runs), none of the 100 user records are added to the database:

PS> start-transaction
PS> 1..100 | new-item mydb:/Users -username { "UserID$_" } -usetransaction
PS> complete-transaction
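
A transaction can also be rolled back explicitly with the built-in Undo-Transaction cmdlet. As a quick sketch against the same mydb: drive (the username value here is arbitrary), the insert below is discarded and the Users table is left unchanged:

PS> start-transaction
PS> new-item mydb:/Users -username "TempUser" -usetransaction
PS> undo-transaction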

Transactions can span multiple operations on the same database:

PS> start-transaction
PS> $users = get-childitem mydb:/Users -filter "status=1" -usetransaction
PS> $users | set-item -value @{ status=0; active=0 } -usetransaction
PS> remove-item mydb:/Users -filter 'active=0' -usetransaction
PS> complete-transaction
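
While a transaction is open, the built-in Get-Transaction cmdlet reports its status and subscriber count, which is a handy way to confirm that the provider operations actually enlisted. A sketch against the same drive:

PS> start-transaction
PS> $users = get-childitem mydb:/Users -filter "status=1" -usetransaction
PS> $users | set-item -value @{ active=0 } -usetransaction
PS> get-transaction
PS> complete-transaction

Get-Transaction shows a Status of Active until the transaction is completed or undone.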

Comments

beefarino Jan 3, 2013 at 3:04 PM 
Transactions aren't for speed, they're for grouping activity together in an atomic unit of work. Moreover, the transactions used here span the powershell process AND the SQLite database - that is, you can include other transactional powershell commands, such as the registry provider or a transacted string object in the same transaction. In that sense, they aren't really optimized for database activity.
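
For example, something like this groups a registry change and a database insert into a single atomic unit - both commit together or both roll back together (a rough sketch; the HKCU:\Software\MyApp key is just an illustration, and mydb: is assumed to be a mounted SQLite drive with a Users table):

start-transaction
new-item HKCU:\Software\MyApp -usetransaction
new-item mydb:/Users -username 'UserID1' -usetransaction
complete-transaction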

Unfortunately speed is an issue here, but not because of the use or absence of transactions. It has to do with the way powershell interacts with the provider and the drive - in a nutshell, the sqlite provider must execute a db command each time new-item is called instead of, say, generating a batch of sql and executing it at the end of the pipeline segment.

In order to provide an option for speedier batching, the provider supports invoke-item at the database and table nodes, allowing you to specify a manual sql string to execute against the database. Compare the results:

ipmo sqlite;
mount-sqlite -name db
# create a test table with a text column and an integer primary key
new-item db:\test -testname text -testid 'integer primary key';
# per-item inserts: one database command per new-item call
measure-command { 1..100 | new-item db:\test -testname 'asdf'; }
# batched inserts: a single sql string executed through invoke-item
measure-command { invoke-item db: -sql (( 1..100 | %{ "insert into test( testname ) values( 'test$_' )" }) -join ';' ) }

I hope this helps, and I apologize for the delay. Somehow this comment slipped through the notification system...

joel2600 Jul 22, 2012 at 10:03 PM 
Hi, thanks for your work putting this together, but I'm confused about transactions.

Isn't the entire point of transactions to speed up database operations?

I seem to get roughly the same number of inserts using transactions as without.

I can insert about 70 records a minute using transactions, and 67 records a minute when not using transactions.

I get the same results even using your first example code above. Is this typical, or should I be seeing better results?