DBIx-Class-ResultSet-RecursiveUpdate-0.40/0000775000175000017500000000000013556035664020503 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/README0000644000175000017500000000066713556035664021372 0ustar ahartmaiahartmaiThis archive contains the distribution DBIx-Class-ResultSet-RecursiveUpdate, version 0.40: like update_or_create - but recursive This software is copyright (c) 2019 by Zbigniew Lukasiak, John Napiorkowski, Alexander Hartmaier. This is free software; you can redistribute it and/or modify it under the same terms as the Perl 5 programming language system itself. This README file was generated by Dist::Zilla::Plugin::Readme v6.012. DBIx-Class-ResultSet-RecursiveUpdate-0.40/Changes0000644000175000017500000000716713556035664022007 0ustar ahartmaiahartmaiRevision history for DBIx::Class::ResultSet::RecursiveUpdate 0.40 2019-10-29 14:16:34+01:00 Europe/Vienna - Fix missing Try::Tiny dependency - Don't try to find a row by primary key if not all primary key columns are defined - Reduce select queries for has_many relationships to one per non-prefetched or none for prefetched has_many relationships down from one per related row even for prefetched relationships - Don't execute a delete/update query for has_many relationships if no rows need to be deleted or unlinked by setting their foreign key column(s) to NULL - Preserve prefetched related resultsets on row update - Stable update order by sorting all columns and relationships 0.34 2014-02-06 23:37:32 America/New_York - More changes to support custom CODE relationship condition 0.33 2014-02-06 14:58:54 America/New_York - Don't die on custom CODE relationship condition 0.32 2014-01-20 19:15:29+01:00 Europe/Vienna - 'id' can't be used any more as an alias for the primary key column name as DBIx::Class doesn't treat it special - Try to construct a new row object with all given update attributes and use it to find the row in the database 0.31 Wed Nov 6, 2013 - Fixed failing test when DBIC_TRACE_PROFILE is set - Fix problem with join_type LEFT and undef (rt67528) - discard_changes before handling post_updates 0.30 Fri Jun 7, 2013 - Update foreign key instead of related object when all PKs are set on a relationship with accessor single or filter 0.29 Thu May 2, 2013 - Remove DBIx::Class::InflateColumn::FS dependency 0.28 Wed Apr 3, 2013 - Don't delete and re-add all many-to-many rows. Transform m2m data to recursive has_many data if IntrospectableM2M is loaded. 0.27 Tues Feb 26, 2013 - Do an update on the object when there are 'other_updates' in addition to when the row 'is_changed' because of possible custom update methods 0.26 Wed Nov 28, 2012 - Fix multi-pk has_many bug - Fix has_many with where conditions 0.25 Thu Apr 12, 2012 - Suppress DBIC warnings: NULL/undef values supplied for requested unique constraint 'primary'. 0.24 2011-05-16 15:34:10 America/New_York - Fixed test case that was failing on newer versions of DBIC, which is more strict when inspecting relationship join conditions. You will need this when you want to upgrade DBIC. 0.23 2011-02-24 18:23:50 Europe/Vienna - Fixed moosified-rs.t failures by making the test skip if not all dependencies are met. Requiring Moose for a compatability test would have been overkill. 
(thanks CPANTesters & RT#65959) 0.22 2011-02-09 19:06:34 Europe/Vienna - Fixed updating of nullable has_many rels (RT#65561) - Fixed usage with moosified resultsets (RT#64773) 0.21 2010-10-28 16:56:18 Europe/Vienna - Warn instead of throwing an exception if a key is neither a column, a relationship nor a many-to-many helper. - More documentation improvements 0.20 2010-10-19 09:25:33 Europe/Vienna - Support has_many relationships with multi-column primary keys 0.013 Thu Apr 08 15:37:13 UTC 2010 - Allow might_have relationships to be empty 0.012 Thu Sep 10 19:44:25 CEST 2009 - updating records linked to by many to many 0.009 Sat Jun 20 16:37:57 CEST 2009 - if_not_submitted flag (experimental) 0.006 Fri May 15 11:03:48 CEST 2009 - Some adjustments for HTML::FormHandler 0.004 Sun Apr 19 11:15:57 CEST 2009 - Added functional interface - for easy use in Form Processors 0.001 Wed Jun 18 13:09:28 2008 - Initial release. DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/0000775000175000017500000000000013556035664020746 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t/pod.t0000644000175000017500000000023413556035664021712 0ustar ahartmaiahartmaiuse strict; use warnings; use Test::More; eval "use Test::Pod 1.14"; plan skip_all => "Test::Pod 1.14 required for testing POD" if $@; all_pod_files_ok(); DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/00load.t0000644000175000017500000000017513556035664022213 0ustar ahartmaiahartmaiuse strict; use warnings; use Test::More; BEGIN { use_ok('DBIx::Class::ResultSet::RecursiveUpdate'); } done_testing(); DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/01_basic.t0000644000175000017500000003123613556035664022517 0ustar ahartmaiahartmaiuse strict; use warnings; use Test::More; use Test::Warn; use Test::Trap; use DBIx::Class::ResultSet::RecursiveUpdate; use lib 't/lib'; use DBSchema; my $schema = DBSchema->get_test_schema(); # moosified tests #use DBSchemaMoose; #my $schema = DBSchemaMoose->get_test_schema('dbi:SQLite:dbname=:memory:'); # pg tests #my ( $dsn, $user, $pass ) = @ENV{ map {"DBICTEST_PG_${_}"} qw/DSN USER PASS/ }; #my $schema = DBSchema->get_test_schema( $dsn, $user, $pass ); my $dvd_rs = $schema->resultset('Dvd'); my $user_rs = $schema->resultset('User'); my $owner = $user_rs->next; my $another_owner = $user_rs->next; my $initial_user_count = $user_rs->count; my $expected_user_count = $initial_user_count; my $initial_dvd_count = $dvd_rs->count; my $updates; # pre 0.21 api $dvd_rs->search( { dvd_id => 1 } ) ->recursive_update( { owner => { username => 'aaa' } }, ['dvd_id'] ); my $u = $user_rs->find( $dvd_rs->find(1)->owner->id ); is( $u->username, 'aaa', 'fixed_fields pre 0.21 api ok' ); # 0.21+ api $dvd_rs->search( { dvd_id => 1 } ) ->recursive_update( { owner => { username => 'bbb' } }, { fixed_fields => ['dvd_id'], } ); $u = $user_rs->find( $dvd_rs->find(1)->owner->id ); is( $u->username, 'bbb', 'fixed_fields 0.21+ api ok' ); { # try to create with a not existing rel my $updates = { name => 'Test for nonexisting rel', username => 'nonexisting_rel', password => 'whatever', nonexisting => { foo => 'bar' }, }; warning_like { my $user = $user_rs->recursive_update($updates); } qr/No such column, relationship, many-to-many helper accessor or generic accessor 'nonexisting'/, 'nonexisting column, accessor, relationship warns'; $expected_user_count++; is( $user_rs->count, $expected_user_count, 'User created' ); } { my $debug = $user_rs->result_source->storage->debug; $user_rs->result_source->storage->debug(0); # try to create with a not existing rel but 
suppressed warning my $updates = { name => 'Test for nonexisting rel with suppressed warning', username => 'suppressed_nonexisting_rel', password => 'whatever', nonexisting => { foo => 'bar' }, }; warning_is { my $user = $user_rs->recursive_update( $updates, { unknown_params_ok => 1 } ); } "", "nonexisting column, accessor, relationship doesn't warn with unknown_params_ok"; $expected_user_count++; $user_rs->result_source->storage->debug($debug); is( $user_rs->count, $expected_user_count, 'User created' ); } { # try to create with a not existing rel, suppressed warning but storage debugging my $updates = { name => 'Test for nonexisting rel with suppressed warning but storage debugging', username => 'suppressed_nonexisting_rel_with_storage_debug', password => 'whatever', nonexisting => { foo => 'bar' }, }; my $debug = $user_rs->result_source->storage->debug; $user_rs->result_source->storage->debug(1); my $user; my @r = trap { $user = $user_rs->recursive_update( $updates, { unknown_params_ok => 1 } ); }; like( $trap->stderr, qr/No such column, relationship, many-to-many helper accessor or generic accessor 'nonexisting'/, "nonexisting column, accessor, relationship doesn't warn with unknown_params_ok" ); $expected_user_count++; is( $user_rs->count, $expected_user_count, 'User created' ); $user_rs->result_source->storage->debug($debug); } # creating new record linked to some old record $updates = { name => 'Test name 2', viewings => [ { user_id => $owner->id } ], owner => { id => $another_owner->id }, }; my $new_dvd = $dvd_rs->recursive_update($updates); is( $dvd_rs->count, $initial_dvd_count + 1, 'Dvd created' ); is( $schema->resultset('User')->count, $expected_user_count, "No new user created" ); is( $new_dvd->name, 'Test name 2', 'Dvd name set' ); is( $new_dvd->owner->id, $another_owner->id, 'Owner set' ); is( $new_dvd->viewings->count, 1, 'Viewing created' ); # creating new records $updates = { tags => [ '2', { id => '3' } ], name => 'Test name', owner => $owner, current_borrower => { name => 'temp name', username => 'temp name', password => 'temp name', }, liner_notes => { notes => 'test note', }, like_has_many => [ { key2 => 1 } ], like_has_many2 => [ { onekey => { name => 'aaaaa' }, key2 => 1 } ], }; my $dvd = $dvd_rs->recursive_update($updates); $expected_user_count++; is( $dvd_rs->count, $initial_dvd_count + 2, 'Dvd created' ); is( $schema->resultset('User')->count, $expected_user_count, "One new user created" ); is( $dvd->name, 'Test name', 'Dvd name set' ); is_deeply( [ map { $_->id } $dvd->tags ], [ '2', '3' ], 'Tags set' ); is( $dvd->owner->id, $owner->id, 'Owner set' ); is( $dvd->current_borrower->name, 'temp name', 'Related record created' ); is( $dvd->liner_notes->notes, 'test note', 'might_have record created' ); ok( $schema->resultset('Twokeys') ->find( { dvd_name => 'Test name', key2 => 1 } ), 'Twokeys created' ); my $onekey = $schema->resultset('Onekey')->search( { name => 'aaaaa' } )->first; ok( $onekey, 'Onekey created' ); ok( $schema->resultset('Twokeys_belongsto') ->find( { key1 => $onekey->id, key2 => 1 } ), 'Twokeys_belongsto created' ); TODO: { local $TODO = 'value of fk from a multi relationship'; is( $dvd->twokeysfk, $onekey->id, 'twokeysfk in Dvd' ); } is( $dvd->name, 'Test name', 'Dvd name set' ); # changing existing records my $num_of_users = $user_rs->count; $updates = { dvd_id => $dvd->dvd_id, name => undef, tags => [], owner => $another_owner->id, current_borrower => { username => 'new name a', name => 'new name a', password => 'new password a', }, liner_notes 
=> { notes => 'test note changed', }, }; my $dvd_updated = $dvd_rs->recursive_update($updates); is( $dvd_updated->dvd_id, $dvd->dvd_id, 'Pk from "dvd_id"' ); is( $schema->resultset('User')->count, $expected_user_count, "No new user created" ); is( $dvd_updated->name, undef, 'Dvd name deleted' ); is( $dvd_updated->get_column('owner'), $another_owner->id, 'Owner updated' ); is( $dvd_updated->current_borrower->name, 'new name a', 'Related record modified' ); is( $dvd_updated->tags->count, 0, 'Tags deleted' ); is( $dvd_updated->liner_notes->notes, 'test note changed', 'might_have record changed' ); my $dvd_with_tags = $dvd_rs->recursive_update( { dvd_id => $dvd->dvd_id, tags => [ 1, 2 ] } ); is_deeply( [ map { $_->id } $dvd_with_tags->tags ], [ 1, 2 ], 'Tags set' ); my $dvd_without_tags = $dvd_rs->recursive_update( { dvd_id => $dvd->dvd_id, tags => undef } ); is( $dvd_without_tags->tags->count, 0, 'Tags deleted when m2m accessor set to undef' ); # this test passes a new value for the pk column dvd_name in the updates hash # and expects that the resolved dvd_name is used for the update $new_dvd->update( { name => 'New Test Name' } ); $updates = { dvd_id => $new_dvd->dvd_id, like_has_many => [ { dvd_name => $dvd->name, key2 => 1 } ], }; $dvd_updated = $dvd_rs->recursive_update($updates); ok( $schema->resultset('Twokeys') ->find( { dvd_name => 'New Test Name', key2 => 1 } ), 'Twokeys updated' ); ok( !$schema->resultset('Twokeys') ->find( { dvd_name => $dvd->name, key2 => 1 } ), 'Twokeys updated' ); # repeatable $updates = { name => 'temp name', username => 'temp username', password => 'temp username', owned_dvds => [ { 'name' => 'temp name 1', 'tags' => [ 1, 2 ], }, { 'name' => 'temp name 2', 'tags' => [ 2, 3 ], } ] }; my $user = $user_rs->recursive_update($updates); $expected_user_count++; is( $schema->resultset('User')->count, $expected_user_count, "New user created" ); is( $dvd_rs->count, $initial_dvd_count + 4, 'Dvds created' ); my %owned_dvds = map { $_->name => $_ } $user->owned_dvds; is( scalar keys %owned_dvds, 2, 'Has many relations created' ); ok( $owned_dvds{'temp name 1'}, 'Name in a has_many related record saved' ); my @tags = $owned_dvds{'temp name 1'}->tags; is( scalar @tags, 2, 'Tags in has_many related record saved' ); ok( $owned_dvds{'temp name 2'}, 'Second name in a has_many related record saved' ); # update has_many where foreign cols aren't nullable $updates = { id => $user->id, address => { street => "101 Main Street", city => "Podunk", state => "New York" }, owned_dvds => [ { dvd_id => 1, }, ] }; $user = $user_rs->recursive_update($updates); is( $schema->resultset('Address')->search( { user_id => $user->id } )->count, 1, 'the right number of addresses' ); $dvd = $dvd_rs->find(1); is( $dvd->get_column('owner'), $user->id, 'foreign key set' ); # has_many where foreign cols are nullable my $available_dvd_rs = $dvd_rs->search( { current_borrower => undef } ); $dvd_rs->update( { current_borrower => $user->id } ); ok( $user->borrowed_dvds->count > 1, 'Precond' ); $updates = { id => $user->id, borrowed_dvds => [ { dvd_id => $dvd->id }, ] }; $user = DBIx::Class::ResultSet::RecursiveUpdate::Functions::recursive_update( resultset => $user_rs, updates => $updates, if_not_submitted => 'set_to_null', ); is( $user->borrowed_dvds->count, 1, 'borrowed_dvds update with if_not_submitted => set_to_null ok' ); is( $available_dvd_rs->count, 5, "previously borrowed dvds weren't deleted" ); $dvd_rs->update( { current_borrower => $user->id } ); $user = 
DBIx::Class::ResultSet::RecursiveUpdate::Functions::recursive_update( resultset => $user_rs, updates => $updates, ); is( $user->borrowed_dvds->count, 1, 'borrowed_dvds update without if_not_submitted ok' ); is( $available_dvd_rs->count, 5, "previously borrowed dvds weren't deleted" ); $dvd_rs->update( { current_borrower => $user->id } ); $user = DBIx::Class::ResultSet::RecursiveUpdate::Functions::recursive_update( resultset => $user_rs, updates => $updates, if_not_submitted => 'delete', ); is( $user->borrowed_dvds->count, 1, 'borrowed_dvds update with if_not_submitted => delete ok' ); is( $dvd_rs->count, 1, 'all dvds except the one borrowed by the user were deleted' ); @tags = $schema->resultset('Tag')->all; $dvd_updated = DBIx::Class::ResultSet::RecursiveUpdate::Functions::recursive_update( resultset => $schema->resultset('Dvd'), updates => { dvd_id => $dvd->dvd_id, tags => [ { id => $tags[0]->id, file => 'file0' }, { id => $tags[1]->id, file => 'file1' } ], } ); $tags[$_]->discard_changes for 0 .. 1; is( $tags[0]->file, 'file0', 'file set in tag' ); is( $tags[1]->file, 'file1', 'file set in tag' ); my @rel_tags = $dvd_updated->tags; is( scalar @rel_tags, 2, 'tags related' ); ok( $rel_tags[0]->file eq 'file0' || $rel_tags[0]->file eq 'file1', 'tags related' ); my $new_person = { name => 'Amiri Barksdale', username => 'amiri', password => 'amiri', }; ok( my $new_user = $user_rs->recursive_update($new_person) ); # delete has_many where foreign cols aren't nullable my $rs_user_dvd = $user->owned_dvds; my @user_dvd_ids = map { $_->dvd_id } $rs_user_dvd->all; is( $rs_user_dvd->count, 1, 'user owns 1 dvd' ); $updates = { id => $user->id, owned_dvds => undef, }; $user = $user_rs->recursive_update($updates); is( $user->owned_dvds->count, 0, 'user owns no dvds' ); is( $dvd_rs->search( { dvd_id => { -in => \@user_dvd_ids } } )->count, 0, 'owned dvds deleted' ); ok (my $dvd_with_keysbymethod = $dvd_rs->create({ name => 'DVD with keys by method relationship', owner => $user }), 'dvd for test created'); $dvd_with_keysbymethod->add_to_keysbymethod({ key1 => 'foo', key2 => 'bar', }); is($dvd_with_keysbymethod->keysbymethod->first->combined_key, 'foo/bar', 'combined_key method returns correct value'); ok( my $dvd_with_keysbymethod_updated = $dvd_rs->recursive_update({ dvd_id => $dvd_with_keysbymethod->id, keysbymethod => [{ combined_key => 'foo/bar', value => 'baz', }] }), 'updating keysbymethod relationship ok' ); my $keysbymethod_rs = $schema->resultset('KeysByMethod'); ok( my $keysbymethod = $keysbymethod_rs->recursive_update({ dvd => $dvd_with_keysbymethod->id, combined_key => 'foo/bar', value => 'new-value', }), 'updating keysbymethod ok' ); is($keysbymethod->value, 'new-value', 'value changed'); done_testing; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/02_cache.t0000644000175000017500000000776413556035664022513 0ustar ahartmaiahartmaiuse strict; use warnings; use Test::More; use Test::DBIC::ExpectedQueries; use DBIx::Class::ResultSet::RecursiveUpdate; use lib 't/lib'; use DBSchema; my $schema = DBSchema->get_test_schema(); my $queries = Test::DBIC::ExpectedQueries->new({ schema => $schema }); my $rs_users = $schema->resultset('User'); my $user = $rs_users->create({ name => 'cache name', username => 'cache username', password => 'cache username', }); my $dvd = $schema->resultset('Dvd')->create({ name => 'existing DVD', owner => $user->id, dvdtags => [{ tag => 1, }, { tag => { name => 'crime' }, }], }); my $rs_users_without_cache = $rs_users->search_rs({ 'me.id' => $user->id }); $queries->run(sub { 
$rs_users_without_cache->recursive_update({ id => $user->id, name => 'updated name', }); }); $queries->test({ usr => { select => 1, update => 1, }, }, 'expected queries without cache'); my $rs_users_with_cache = $rs_users->search_rs({ 'me.id' => $user->id }, { cache => 1, }); diag("populate cache"); $rs_users_with_cache->all; $queries->run(sub { $rs_users_with_cache->recursive_update({ id => $user->id, name => 'updated name 2', }); }); $queries->test({ usr => { update => 1, }, }, 'expected queries with cache'); # test related rows cache not used after update $rs_users_with_cache = $rs_users->search_rs({ 'me.id' => $user->id }, { prefetch => 'owned_dvds', cache => 1, }); diag("populate cache"); $rs_users_with_cache->all; $queries->run(sub { $rs_users_with_cache->recursive_update({ id => $user->id, name => 'cache name updated', owned_dvds => [ { dvd_id => 5, } ], }); }); $queries->test({ usr => { update => 1, }, }, 'expected queries with unchanged has_many relationship and cache'); $rs_users_with_cache = $rs_users->search_rs({ 'me.id' => $user->id }, { prefetch => { owned_dvds => { 'dvdtags' => 'tag' } }, cache => 1, }); diag("populate cache"); $rs_users_with_cache->all; $queries->run(sub { $rs_users_with_cache->recursive_update({ id => $user->id, owned_dvds => [ { dvd_id => $dvd->id, name => 'existing DVD', }, { name => 'new DVD', } ] }); }); $queries->test({ dvd => { insert => 1, # one by the discard_changes call for created rows select => 1, }, }, 'expected queries with has_many relationship and cache'); $rs_users_with_cache = $rs_users->search_rs({ 'me.id' => $user->id }, { prefetch => { owned_dvds => { 'dvdtags' => 'tag' } }, cache => 1, }); diag("populate cache"); $rs_users_with_cache->all; ok (my $new_dvd = $user->owned_dvds->find({ name => 'new DVD'}), 'new DVD found'); $queries->run(sub { $rs_users_with_cache->recursive_update({ id => $user->id, owned_dvds => [ { dvd_id => $dvd->id, tags => [ 1, 3 ], }, { dvd_id => $new_dvd->id, tags => [ 2, 3 ], } ] }); }); $queries->test({ dvdtag => { # one for tag 3 of 'existing DVD' # two for tags 2 and 3 of 'new DVD' insert => 3, # one for the find of existing tag 3 of 'existing DVD' # one from the discard_changes call for created tag 3 of 'existing DVD' # two for the find of the two existing tags of 'new DVD' # two from the discard_changes call for created tags of 'new DVD' select => 6, # this is the cleanup query which deletes all tags of a dvd not # passed to tags, in this case the 'crime' tag created above delete => 1, }, }, 'expected queries with many_to_many relationship helper and cache'); done_testing; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/undef_pk.t0000644000175000017500000000210013556035664022715 0ustar ahartmaiahartmaiuse strict; use warnings; use Test::More; use DBIx::Class::ResultSet::RecursiveUpdate; use lib 't/lib'; use DBSchema; my $schema = DBSchema->get_test_schema(); # OK, create podcast that belongs_to owner my $podcast = $schema->resultset('Podcast')->create({ title => 'Pirates of the Caribbean', owner => {name => 'Bob'} }); is( $podcast->title, 'Pirates of the Caribbean', 'podcast name is correct'); is( $podcast->owner->name, 'Bob', 'owner is correct' ); my $owner = $podcast->owner; # FAIL: trying to update podcast: set owner to NULL DBIx::Class::ResultSet::RecursiveUpdate::Functions::recursive_update( resultset => $schema->resultset('Podcast'), updates => { title => 'Pirates of the Caribbean II', owner => undef }, object => $podcast ); $podcast->discard_changes; # OK, title updated correctly is( $podcast->title, 
'Pirates of the Caribbean II', 'podcast name is correct'); ok( ! $podcast->owner, 'no podcast owner'); # clear db $podcast->delete; $owner->delete; done_testing; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/var/0000775000175000017500000000000013556035664021536 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t/var/dvdzbr.db0000644000175000017500000012600013556035664023335 0ustar ahartmaiahartmai[binary SQLite database file: contents omitted]
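The dvdzbr.db file above is the prebuilt SQLite database that the bundled test schema deploys and populates (t/lib/DBSchemaBase.pm, further below, defaults to dbi:SQLite:dbname=t/var/dvdzbr.db). For orientation only, here is a minimal sketch of how t/01_basic.t drives recursive_update against that schema; it is not part of the distribution itself, all class, column and relationship names are taken from the test files in this archive, and the exact behaviour of the nested hashes is whatever the tests assert:

use strict;
use warnings;
use lib 't/lib';
use DBSchema;

# deploy and populate the test schema (t/var/dvdzbr.db by default)
my $schema = DBSchema->get_test_schema();

# recursive_update() works like update_or_create(), but also recurses into
# related rows passed as nested hashes/arrays (see t/01_basic.t)
my $dvd = $schema->resultset('Dvd')->recursive_update({
    dvd_id => 1,                                 # primary key of an existing row
    name   => 'Picnick under the Hanging Rock',  # plain column update
    owner  => { username => 'aaa' },             # nested data for the belongs_to 'owner' relationship
    tags   => [ 1, 2 ],                          # many_to_many 'tags' helper, set by primary keys
});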
REFERENCES dvd(twokeysfk), FOREIGN KEY (key1) REFERENCES onekey(id) ON DELETE CASCADE )DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/var/placeholder0000644000175000017500000000010113556035664023731 0ustar ahartmaiahartmaiplaceholder for git and dzil which both ignore empty directories DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/pod-coverage.t0000644000175000017500000000030113556035664023476 0ustar ahartmaiahartmaiuse strict; use warnings; use Test::More; eval "use Test::Pod::Coverage 1.04"; plan skip_all => "Test::Pod::Coverage 1.04 required for testing POD coverage" if $@; all_pod_coverage_ok(); DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/0000775000175000017500000000000013556035664021514 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/sqlite.sql0000644000175000017500000001764413556035664023550 0ustar ahartmaiahartmai-- -- Table: artist -- CREATE TABLE artist ( artistid INTEGER PRIMARY KEY NOT NULL, name varchar(100), rank integer NOT NULL DEFAULT '13', charfield char(10) ); -- -- Table: artist_undirected_map -- CREATE TABLE artist_undirected_map ( id1 integer NOT NULL, id2 integer NOT NULL, PRIMARY KEY (id1, id2) ); CREATE INDEX artist_undirected_map_idx_id1_ ON artist_undirected_map (id1); CREATE INDEX artist_undirected_map_idx_id2_ ON artist_undirected_map (id2); -- -- Table: cd_artwork -- CREATE TABLE cd_artwork ( cd_id INTEGER PRIMARY KEY NOT NULL ); CREATE INDEX cd_artwork_idx_cd_id_cd_artwor ON cd_artwork (cd_id); -- -- Table: artwork_to_artist -- CREATE TABLE artwork_to_artist ( artwork_cd_id integer NOT NULL, artist_id integer NOT NULL, PRIMARY KEY (artwork_cd_id, artist_id) ); CREATE INDEX artwork_to_artist_idx_artist_id_artwork_to_arti ON artwork_to_artist (artist_id); CREATE INDEX artwork_to_artist_idx_artwork_cd_id_artwork_to_ ON artwork_to_artist (artwork_cd_id); -- -- Table: bookmark -- CREATE TABLE bookmark ( id INTEGER PRIMARY KEY NOT NULL, link integer NOT NULL ); CREATE INDEX bookmark_idx_link_bookmark ON bookmark (link); -- -- Table: books -- CREATE TABLE books ( id INTEGER PRIMARY KEY NOT NULL, source varchar(100) NOT NULL, owner integer NOT NULL, title varchar(100) NOT NULL, price integer ); -- -- Table: cd -- CREATE TABLE cd ( cdid INTEGER PRIMARY KEY NOT NULL, artist integer NOT NULL, title varchar(100) NOT NULL, year varchar(100) NOT NULL, genreid integer, single_track_id integer ); CREATE INDEX cd_idx_artist_cd ON cd (artist); CREATE INDEX cd_idx_genreid_cd ON cd (genreid); CREATE INDEX cd_idx_single_track_cd ON cd (single_track_id); CREATE UNIQUE INDEX cd_artist_title_cd ON cd (artist, title); -- -- Table: cd_to_producer -- CREATE TABLE cd_to_producer ( cd integer NOT NULL, producer integer NOT NULL, PRIMARY KEY (cd, producer) ); CREATE INDEX cd_to_producer_idx_cd_cd_to_pr ON cd_to_producer (cd); CREATE INDEX cd_to_producer_idx_producer_cd ON cd_to_producer (producer); -- -- Table: collection -- CREATE TABLE collection ( collectionid INTEGER PRIMARY KEY NOT NULL, name varchar(100) NOT NULL ); -- -- Table: collection_object -- CREATE TABLE collection_object ( collection integer NOT NULL, object integer NOT NULL, PRIMARY KEY (collection, object) ); CREATE INDEX collection_object_idx_collection_collection_obj ON collection_object (collection); CREATE INDEX collection_object_idx_object_c ON collection_object (object); -- -- Table: employee -- CREATE TABLE employee ( employee_id INTEGER PRIMARY KEY NOT NULL, position integer NOT NULL, group_id integer, group_id_2 integer, name varchar(100) ); -- -- Table: event -- CREATE TABLE event ( id 
INTEGER PRIMARY KEY NOT NULL, starts_at datetime NOT NULL, created_on timestamp NOT NULL, varchar_date varchar(20), varchar_datetime varchar(20), skip_inflation datetime ); -- -- Table: file_columns -- CREATE TABLE file_columns ( id INTEGER PRIMARY KEY NOT NULL, file varchar(255) NOT NULL ); -- -- Table: forceforeign -- CREATE TABLE forceforeign ( artist INTEGER PRIMARY KEY NOT NULL, cd integer NOT NULL ); CREATE INDEX forceforeign_idx_artist_forcef ON forceforeign (artist); -- -- Table: fourkeys -- CREATE TABLE fourkeys ( foo integer NOT NULL, bar integer NOT NULL, hello integer NOT NULL, goodbye integer NOT NULL, sensors character NOT NULL, PRIMARY KEY (foo, bar, hello, goodbye) ); -- -- Table: fourkeys_to_twokeys -- CREATE TABLE fourkeys_to_twokeys ( f_foo integer NOT NULL, f_bar integer NOT NULL, f_hello integer NOT NULL, f_goodbye integer NOT NULL, t_artist integer NOT NULL, t_cd integer NOT NULL, autopilot character NOT NULL, PRIMARY KEY (f_foo, f_bar, f_hello, f_goodbye, t_artist, t_cd) ); CREATE INDEX fourkeys_to_twokeys_idx_f_foo_f_bar_f_hello_f_goodbye_ ON fourkeys_to_twokeys (f_foo, f_bar, f_hello, f_goodbye); CREATE INDEX fourkeys_to_twokeys_idx_t_artist_t_cd_fourkeys_to ON fourkeys_to_twokeys (t_artist, t_cd); -- -- Table: genre -- CREATE TABLE genre ( genreid INTEGER PRIMARY KEY NOT NULL, name varchar(100) NOT NULL ); CREATE UNIQUE INDEX genre_name_genre ON genre (name); -- -- Table: images -- CREATE TABLE images ( id INTEGER PRIMARY KEY NOT NULL, artwork_id integer NOT NULL, name varchar(100) NOT NULL, data blob ); CREATE INDEX images_idx_artwork_id_images ON images (artwork_id); -- -- Table: liner_notes -- CREATE TABLE liner_notes ( liner_id INTEGER PRIMARY KEY NOT NULL, notes varchar(100) NOT NULL ); CREATE INDEX liner_notes_idx_liner_id_liner ON liner_notes (liner_id); -- -- Table: link -- CREATE TABLE link ( id INTEGER PRIMARY KEY NOT NULL, url varchar(100), title varchar(100) ); -- -- Table: lyric_versions -- CREATE TABLE lyric_versions ( id INTEGER PRIMARY KEY NOT NULL, lyric_id integer NOT NULL, text varchar(100) NOT NULL ); CREATE INDEX lyric_versions_idx_lyric_id_ly ON lyric_versions (lyric_id); -- -- Table: lyrics -- CREATE TABLE lyrics ( lyric_id INTEGER PRIMARY KEY NOT NULL, track_id integer NOT NULL ); CREATE INDEX lyrics_idx_track_id_lyrics ON lyrics (track_id); -- -- Table: noprimarykey -- CREATE TABLE noprimarykey ( foo integer NOT NULL, bar integer NOT NULL, baz integer NOT NULL ); CREATE UNIQUE INDEX foo_bar_noprimarykey ON noprimarykey (foo, bar); -- -- Table: onekey -- CREATE TABLE onekey ( id INTEGER PRIMARY KEY NOT NULL, artist integer NOT NULL, cd integer NOT NULL ); -- -- Table: owners -- CREATE TABLE owners ( ownerid INTEGER PRIMARY KEY NOT NULL, name varchar(100) NOT NULL ); -- -- Table: producer -- CREATE TABLE producer ( producerid INTEGER PRIMARY KEY NOT NULL, name varchar(100) NOT NULL ); CREATE UNIQUE INDEX prod_name_producer ON producer (name); -- -- Table: self_ref -- CREATE TABLE self_ref ( id INTEGER PRIMARY KEY NOT NULL, name varchar(100) NOT NULL ); -- -- Table: self_ref_alias -- CREATE TABLE self_ref_alias ( self_ref integer NOT NULL, alias integer NOT NULL, PRIMARY KEY (self_ref, alias) ); CREATE INDEX self_ref_alias_idx_alias_self_ ON self_ref_alias (alias); CREATE INDEX self_ref_alias_idx_self_ref_se ON self_ref_alias (self_ref); -- -- Table: sequence_test -- CREATE TABLE sequence_test ( pkid1 integer NOT NULL, pkid2 integer NOT NULL, nonpkid integer NOT NULL, name varchar(100), PRIMARY KEY (pkid1, pkid2) ); -- -- Table: serialized -- 
CREATE TABLE serialized ( id INTEGER PRIMARY KEY NOT NULL, serialized text NOT NULL ); -- -- Table: tags -- CREATE TABLE tags ( tagid INTEGER PRIMARY KEY NOT NULL, cd integer NOT NULL, tag varchar(100) NOT NULL ); CREATE INDEX tags_idx_cd_tags ON tags (cd); -- -- Table: track -- CREATE TABLE track ( trackid INTEGER PRIMARY KEY NOT NULL, cd integer NOT NULL, position integer NOT NULL, title varchar(100) NOT NULL, last_updated_on datetime ); CREATE INDEX track_idx_cd_track ON track (cd); CREATE UNIQUE INDEX track_cd_position_track ON track (cd, position); CREATE UNIQUE INDEX track_cd_title_track ON track (cd, title); -- -- Table: treelike -- CREATE TABLE treelike ( id INTEGER PRIMARY KEY NOT NULL, parent integer, name varchar(100) NOT NULL ); CREATE INDEX treelike_idx_parent_treelike ON treelike (parent); -- -- Table: twokeytreelike -- CREATE TABLE twokeytreelike ( id1 integer NOT NULL, id2 integer NOT NULL, parent1 integer NOT NULL, parent2 integer NOT NULL, name varchar(100) NOT NULL, PRIMARY KEY (id1, id2) ); CREATE INDEX twokeytreelike_idx_parent1_parent2_twokeytre ON twokeytreelike (parent1, parent2); CREATE UNIQUE INDEX tktlnameunique_twokeytreelike ON twokeytreelike (name); -- -- Table: twokeys -- CREATE TABLE twokeys ( artist integer NOT NULL, cd integer NOT NULL, PRIMARY KEY (artist, cd) ); CREATE INDEX twokeys_idx_artist_twokeys ON twokeys (artist); -- -- Table: typed_object -- CREATE TABLE typed_object ( objectid INTEGER PRIMARY KEY NOT NULL, type varchar(100) NOT NULL, value varchar(100) NOT NULL ); DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest.pm0000644000175000017500000001472013556035664023415 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest; use strict; use warnings; use DBICTest::Schema; =head1 NAME DBICTest - Custom RU package for testing a schema =head1 SYNOPSIS use lib qw(t/lib); use DBICTest; use Test::More; my $schema = DBICTest->init_schema(); =head1 DESCRIPTION This module provides the basic utilities to write tests against DBIx::Class. =cut sub has_custom_dsn { return $ENV{"DBICTEST_DSN"} ? 1:0; } sub _sqlite_dbfilename { return "t/var/DBIxClass.db"; } sub _sqlite_dbname { my $self = shift; my %args = @_; return $self->_sqlite_dbfilename if $args{sqlite_use_file} or $ENV{"DBICTEST_SQLITE_USE_FILE"}; return ":memory:"; } sub _database { my $self = shift; my %args = @_; my $db_file = $self->_sqlite_dbname(%args); unlink($db_file) if -e $db_file; unlink($db_file . "-journal") if -e $db_file . "-journal"; mkdir("t/var") unless -d "t/var"; my $dsn = $ENV{"DBICTEST_DSN"} || "dbi:SQLite:${db_file}"; my $dbuser = $ENV{"DBICTEST_DBUSER"} || ''; my $dbpass = $ENV{"DBICTEST_DBPASS"} || ''; my @connect_info = ($dsn, $dbuser, $dbpass, { AutoCommit => 1 }); return @connect_info; } sub init_schema { my $self = shift; my %args = @_; my $schema; $schema = DBICTest::Schema->compose_namespace('DBICTest'); $schema = $schema->connect($self->_database(%args)); $self->deploy_schema( $schema, $args{deploy_args} ); $self->populate_schema( $schema ); return $schema; } =head2 deploy_schema =cut sub deploy_schema { my $self = shift; my $schema = shift; my $args = shift || {}; open IN, "t/lib/sqlite.sql"; my $sql; { local $/ = undef; $sql = ; } close IN; $schema->storage->txn_begin; for my $chunk ( split (/;\s*\n+/, $sql) ) { if ( $chunk =~ / ^ (?! 
--\s* ) \S /xm ) { # there is some real sql in the chunk - a non-space at the start of the string which is not a comment $schema->storage->dbh->do($chunk) or print "Error on SQL: $chunk\n"; } } $schema->storage->txn_commit; return; } =head2 populate_schema DBICTest->populate_schema( $schema ); After you deploy your schema you can use this method to populate the tables with test data. =cut sub populate_schema { my $self = shift; my $schema = shift; $schema->populate('Artist', [ [ qw/artistid name/ ], [ 1, 'Caterwauler McCrae' ], [ 2, 'Random Boy Band' ], [ 3, 'We Are Goth' ], ]); $schema->populate('CD', [ [ qw/cdid artist title year/ ], [ 1, 1, "Spoonful of bees", 1999 ], [ 2, 1, "Forkful of bees", 2001 ], [ 3, 1, "Caterwaulin' Blues", 1997 ], [ 4, 2, "Generic Manufactured Singles", 2001 ], [ 5, 3, "Come Be Depressed With Us", 1998 ], ]); $schema->populate('LinerNotes', [ [ qw/liner_id notes/ ], [ 2, "Buy Whiskey!" ], [ 4, "Buy Merch!" ], [ 5, "Kill Yourself!" ], ]); $schema->populate('Tag', [ [ qw/tagid cd tag/ ], [ 1, 1, "Blue" ], [ 2, 2, "Blue" ], [ 3, 3, "Blue" ], [ 4, 5, "Blue" ], [ 5, 2, "Cheesy" ], [ 6, 4, "Cheesy" ], [ 7, 5, "Cheesy" ], [ 8, 2, "Shiny" ], [ 9, 4, "Shiny" ], ]); $schema->populate('TwoKeys', [ [ qw/artist cd/ ], [ 1, 1 ], [ 1, 2 ], [ 2, 2 ], ]); $schema->populate('FourKeys', [ [ qw/foo bar hello goodbye sensors/ ], [ 1, 2, 3, 4, 'online' ], [ 5, 4, 3, 6, 'offline' ], ]); $schema->populate('OneKey', [ [ qw/id artist cd/ ], [ 1, 1, 1 ], [ 2, 1, 2 ], [ 3, 2, 2 ], ]); $schema->populate('SelfRef', [ [ qw/id name/ ], [ 1, 'First' ], [ 2, 'Second' ], ]); $schema->populate('SelfRefAlias', [ [ qw/self_ref alias/ ], [ 1, 2 ] ]); $schema->populate('ArtistUndirectedMap', [ [ qw/id1 id2/ ], [ 1, 2 ] ]); $schema->populate('Producer', [ [ qw/producerid name/ ], [ 1, 'Matt S Trout' ], [ 2, 'Bob The Builder' ], [ 3, 'Fred The Phenotype' ], ]); $schema->populate('CD_to_Producer', [ [ qw/cd producer/ ], [ 1, 1 ], [ 1, 2 ], [ 1, 3 ], ]); $schema->populate('TreeLike', [ [ qw/id parent name/ ], [ 1, undef, 'root' ], [ 2, 1, 'foo' ], [ 3, 2, 'bar' ], [ 6, 2, 'blop' ], [ 4, 3, 'baz' ], [ 5, 4, 'quux' ], [ 7, 3, 'fong' ], ]); $schema->populate('Track', [ [ qw/trackid cd position title/ ], [ 4, 2, 1, "Stung with Success"], [ 5, 2, 2, "Stripy"], [ 6, 2, 3, "Sticky Honey"], [ 7, 3, 1, "Yowlin"], [ 8, 3, 2, "Howlin"], [ 9, 3, 3, "Fowlin"], [ 10, 4, 1, "Boring Name"], [ 11, 4, 2, "Boring Song"], [ 12, 4, 3, "No More Ideas"], [ 13, 5, 1, "Sad"], [ 14, 5, 2, "Under The Weather"], [ 15, 5, 3, "Suicidal"], [ 16, 1, 1, "The Bees Knees"], [ 17, 1, 2, "Apiary"], [ 18, 1, 3, "Beehind You"], ]); $schema->populate('Event', [ [ qw/id starts_at created_on varchar_date varchar_datetime skip_inflation/ ], [ 1, '2006-04-25 22:24:33', '2006-06-22 21:00:05', '2006-07-23', '2006-05-22 19:05:07', '2006-04-21 18:04:06'], ]); $schema->populate('Link', [ [ qw/id url title/ ], [ 1, '', 'aaa' ] ]); $schema->populate('Bookmark', [ [ qw/id link/ ], [ 1, 1 ] ]); $schema->populate('Collection', [ [ qw/collectionid name/ ], [ 1, "Tools" ], [ 2, "Body Parts" ], ]); $schema->populate('TypedObject', [ [ qw/objectid type value/ ], [ 1, "pointy", "Awl" ], [ 2, "round", "Bearing" ], [ 3, "pointy", "Knife" ], [ 4, "pointy", "Tooth" ], [ 5, "round", "Head" ], ]); $schema->populate('CollectionObject', [ [ qw/collection object/ ], [ 1, 1 ], [ 1, 2 ], [ 1, 3 ], [ 2, 4 ], [ 2, 5 ], ]); $schema->populate('Owners', [ [ qw/ownerid name/ ], [ 1, "Newton" ], [ 2, "Waltham" ], ]); $schema->populate('BooksInLibrary', [ [ qw/id owner title 
source price/ ], [ 1, 1, "Programming Perl", "Library", 23 ], [ 2, 1, "Dynamical Systems", "Library", 37 ], [ 3, 2, "Best Recipe Cookbook", "Library", 65 ], ]); } 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/MySchema.pm0000644000175000017500000000015613556035664023560 0ustar ahartmaiahartmaipackage MySchema; use strict; use warnings; use base 'DBIx::Class::Schema'; __PACKAGE__->load_classes; 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBSchema.pm0000644000175000017500000000026413556035664023460 0ustar ahartmaiahartmaipackage DBSchema; use strict; use warnings; use base 'DBSchemaBase'; __PACKAGE__->load_namespaces( default_resultset_class => '+DBIx::Class::ResultSet::RecursiveUpdate' ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DebugObject.pm0000644000175000017500000000076013556035664024230 0ustar ahartmaiahartmai package DebugObject; sub new { my $class = shift; return bless {messages => []}, $class; } sub print{ my ($self, @messages) = @_; push @{$self->{messages}}, @messages; } sub clear{ $_[0]->{messages} = []; } sub grep_messages{ my ($self, $grep) = @_; return grep { $_ =~ qr/$grep/ } @{$self->{messages}}; } sub get_messages{ $_[0]->{messages}; } sub count_messages{ my ($self, $grep) = @_; return scalar( defined $grep ? $self->grep_messages($grep) : $self->get_messages); } 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBSchemaBase.pm0000644000175000017500000000312713556035664024254 0ustar ahartmaiahartmaipackage DBSchemaBase; use strict; use warnings; use base 'DBIx::Class::Schema'; sub tables_exist { my $dbh = shift; # assume that all tables exist if table dvd is found return $dbh->tables( '%', '%', 'dvd' ); } sub get_test_schema { my ( $class, $dsn, $user, $pass, $opts ) = @_; $dsn ||= 'dbi:SQLite:dbname=t/var/dvdzbr.db'; warn "testing $dsn\n"; my $schema = $class->connect( $dsn, $user, $pass, $opts || {} ); my $deploy_attrs; $deploy_attrs->{add_drop_table} = 1 if tables_exist( $schema->storage->dbh ); $schema->deploy( $deploy_attrs ); $schema->populate('Personality', [ [ qw/user_id / ], [ '1'], [ '2' ], [ '3'], ] ); $schema->populate('User', [ [ qw/username name password / ], [ 'jgda', 'Jonas Alves', ''], [ 'isa' , 'Isa', '', ], [ 'zby' , 'Zbyszek Lukasiak', ''], ] ); $schema->populate('Tag', [ [ qw/name file / ], [ 'comedy', '' ], [ 'dramat', '' ], [ 'australian', '' ], ] ); $schema->populate('Dvd', [ [ qw/name imdb_id owner current_borrower creation_date alter_date / ], [ 'Picnick under the Hanging Rock', 123, 1, 3, '2003-01-16 23:12:01', undef ], [ 'The Deerhunter', 1234, 1, 1, undef, undef ], [ 'Rejs', 1235, 3, 1, undef, undef ], [ 'Seksmisja', 1236, 3, 1, undef, undef ], ] ); $schema->populate( 'Dvdtag', [ [ qw/ dvd tag / ], [ 1, 2 ], [ 1, 3 ], [ 3, 1 ], [ 4, 1 ], ] ); return $schema; } 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBSchemaMoose.pm0000644000175000017500000000033313556035664024460 0ustar ahartmaiahartmaipackage DBSchemaMoose; use strict; use warnings; use base 'DBSchemaBase'; __PACKAGE__->load_namespaces( result_namespace => '+DBSchema::Result', default_resultset_class => '+DBSchemaMoose::ResultSet', ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/MySchema/0000775000175000017500000000000013556035664023222 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/MySchema/Test.pm0000644000175000017500000000137113556035664024477 0ustar ahartmaiahartmaipackage MySchema::Test; use strict; use warnings; use base 'DBIx::Class'; __PACKAGE__->load_components(qw/ InflateColumn::DateTime PK::Auto Core /); 
__PACKAGE__->table("test"); __PACKAGE__->add_columns( hidden_col => { data_type => "INTEGER" }, text_col => { data_type => "TEXT" }, password_col => { data_type => "TEXT" }, checkbox_col => { data_type => "TEXT", default_value => 0, is_nullable => 0, }, select_col => { data_type => "TEXT" }, radio_col => { data_type => "TEXT" }, radiogroup_col => { data_type => "TEXT" }, date_col => { data_type => "DATE" }, not_in_form => { data_type => "TEXT" }, ); __PACKAGE__->set_primary_key("hidden_col"); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/0000775000175000017500000000000013556035664023055 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Stats.pm0000644000175000017500000000167313556035664024516 0ustar ahartmaiahartmaipackage DBICTest::Stats; use strict; use warnings; use base qw/DBIx::Class::Storage::Statistics/; sub txn_begin { my $self = shift; $self->{'TXN_BEGIN'}++; return $self->{'TXN_BEGIN'}; } sub txn_rollback { my $self = shift; $self->{'TXN_ROLLBACK'}++; return $self->{'TXN_ROLLBACK'}; } sub txn_commit { my $self = shift; $self->{'TXN_COMMIT'}++; return $self->{'TXN_COMMIT'}; } sub svp_begin { my ($self, $name) = @_; $self->{'SVP_BEGIN'}++; return $self->{'SVP_BEGIN'}; } sub svp_release { my ($self, $name) = @_; $self->{'SVP_RELEASE'}++; return $self->{'SVP_RELEASE'}; } sub svp_rollback { my ($self, $name) = @_; $self->{'SVP_ROLLBACK'}++; return $self->{'SVP_ROLLBACK'}; } sub query_start { my ($self, $string, @bind) = @_; $self->{'QUERY_START'}++; return $self->{'QUERY_START'}; } sub query_end { my ($self, $string) = @_; $self->{'QUERY_END'}++; return $self->{'QUERY_START'}; } 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Plain.pm0000644000175000017500000000130013556035664024446 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Plain; use strict; use warnings; use base qw/DBIx::Class::Schema/; use DBI; my $db_file = "t/var/Plain.db"; unlink($db_file) if -e $db_file; unlink($db_file . "-journal") if -e $db_file . 
"-journal"; mkdir("t/var") unless -d "t/var"; my $dsn = "dbi:SQLite:${db_file}"; __PACKAGE__->load_classes("Test"); my $schema = __PACKAGE__->compose_connection( __PACKAGE__, $dsn, undef, undef, { AutoCommit => 1 } ); my $dbh = DBI->connect($dsn); my $sql = <do($_) for split(/\n\n/, $sql); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema.pm0000644000175000017500000000155013556035664024612 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema; use base qw/DBIx::Class::Schema/; no warnings qw/qw/; __PACKAGE__->load_classes(qw/ Artist SequenceTest Employee CD FileColumn Genre Link Bookmark #dummy Track Tag /, { 'DBICTest::Schema' => [qw/ LinerNotes Artwork Artwork_to_Artist Image Lyrics LyricVersion OneKey #dummy TwoKeys Serialized /]}, ( 'FourKeys', 'FourKeys_to_TwoKeys', '#dummy', 'SelfRef', 'ArtistUndirectedMap', 'ArtistSourceName', 'ArtistSubclass', 'Producer', 'CD_to_Producer', ), qw/SelfRefAlias TreeLike TwoKeyTreeLike Event EventTZ NoPrimaryKey/, qw/Collection CollectionObject TypedObject Owners BooksInLibrary/, qw/ForceForeign/, ); sub sqlt_deploy_hook { my ($self, $sqlt_schema) = @_; $sqlt_schema->drop_table('dummy'); } 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/0000775000175000017500000000000013556035664024255 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/CD.pm0000644000175000017500000000420113556035664025074 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::CD; use base 'DBIx::Class::Core'; __PACKAGE__->table('cd'); __PACKAGE__->add_columns( 'cdid' => { data_type => 'integer', is_auto_increment => 1, }, 'artist' => { data_type => 'integer', }, 'title' => { data_type => 'varchar', size => 100, }, 'year' => { data_type => 'varchar', size => 100, }, 'genreid' => { data_type => 'integer', is_nullable => 1, }, 'single_track_id' => { data_type => 'integer', is_nullable => 1, is_foreign_key => 1, } ); __PACKAGE__->set_primary_key('cdid'); __PACKAGE__->add_unique_constraint([ qw/artist title/ ]); __PACKAGE__->belongs_to( artist => 'DBICTest::Schema::Artist', {artistid => "artist"}, { is_deferrable => 1, accessor => "single", }); # in case this is a single-cd it promotes a track from another cd __PACKAGE__->belongs_to( single_track => 'DBICTest::Schema::Track', {'foreign.trackid'=>'self.single_track_id'}, ); __PACKAGE__->has_many( tracks => 'DBICTest::Schema::Track' ); __PACKAGE__->has_many( tags => 'DBICTest::Schema::Tag', undef, { order_by => 'tag' }, ); __PACKAGE__->has_many( cd_to_producer => 'DBICTest::Schema::CD_to_Producer' => 'cd' ); __PACKAGE__->might_have( liner_notes => 'DBICTest::Schema::LinerNotes', undef, { proxy => [ qw/notes/ ] }, ); __PACKAGE__->might_have(artwork => 'DBICTest::Schema::Artwork', 'cd_id'); __PACKAGE__->many_to_many( producers => cd_to_producer => 'producer' ); __PACKAGE__->many_to_many( producers_sorted => cd_to_producer => 'producer', { order_by => 'producer.name' }, ); __PACKAGE__->belongs_to('genre', 'DBICTest::Schema::Genre', { 'foreign.genreid' => 'self.genreid' }, { join_type => 'left', on_delete => 'SET NULL', on_update => 'CASCADE', }, ); #__PACKAGE__->add_relationship('genre', 'DBICTest::Schema::Genre', # { 'foreign.genreid' => 'self.genreid' }, # { 'accessor' => 'single' } #); __PACKAGE__->resultset_class( __PACKAGE__ . 
'::ResultSet'); package DBICTest::Schema::CD::ResultSet; use base qw( DBIx::Class::ResultSet::RecursiveUpdate ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/Tag.pm0000644000175000017500000000112513556035664025323 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Tag; use base qw/DBIx::Class::Core/; __PACKAGE__->table('tags'); __PACKAGE__->add_columns( 'tagid' => { data_type => 'integer', is_auto_increment => 1, }, 'cd' => { data_type => 'integer', }, 'tag' => { data_type => 'varchar', size => 100, }, ); __PACKAGE__->set_primary_key('tagid'); __PACKAGE__->belongs_to( cd => 'DBICTest::Schema::CD' ); __PACKAGE__->resultset_class( __PACKAGE__ . '::ResultSet'); package DBICTest::Schema::Tag::ResultSet; use base qw( DBIx::Class::ResultSet::RecursiveUpdate ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/Link.pm0000644000175000017500000000132613556035664025510 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Link; use base 'DBIx::Class::Core'; use strict; use warnings; __PACKAGE__->table('link'); __PACKAGE__->add_columns( 'id' => { data_type => 'integer', is_auto_increment => 1 }, 'url' => { data_type => 'varchar', size => 100, is_nullable => 1, }, 'title' => { data_type => 'varchar', size => 100, is_nullable => 1, }, ); __PACKAGE__->set_primary_key('id'); use overload '""' => sub { shift->url }, fallback=> 1; __PACKAGE__->resultset_class( __PACKAGE__ . '::ResultSet'); package DBICTest::Schema::Link::ResultSet; use base qw( DBIx::Class::ResultSet::RecursiveUpdate ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/Track.pm0000644000175000017500000000215113556035664025654 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Track; use base 'DBIx::Class::Core'; __PACKAGE__->table('track'); __PACKAGE__->add_columns( 'trackid' => { data_type => 'integer', is_auto_increment => 1, }, 'cd' => { data_type => 'integer', }, 'position' => { data_type => 'integer', accessor => 'pos', }, 'title' => { data_type => 'varchar', size => 100, }, last_updated_on => { data_type => 'datetime', accessor => 'updated_date', is_nullable => 1 }, ); __PACKAGE__->set_primary_key('trackid'); __PACKAGE__->add_unique_constraint([ qw/cd position/ ]); __PACKAGE__->add_unique_constraint([ qw/cd title/ ]); __PACKAGE__->belongs_to( cd => 'DBICTest::Schema::CD' ); __PACKAGE__->belongs_to( disc => 'DBICTest::Schema::CD' => 'cd'); __PACKAGE__->might_have( cd_single => 'DBICTest::Schema::CD', 'single_track_id' ); __PACKAGE__->might_have( lyrics => 'DBICTest::Schema::Lyrics', 'track_id' ); __PACKAGE__->resultset_class( __PACKAGE__ . 
'::ResultSet'); package DBICTest::Schema::Track::ResultSet; use base qw( DBIx::Class::ResultSet::RecursiveUpdate ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/Dummy.pm0000644000175000017500000000062613556035664025710 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Dummy; use base 'DBIx::Class::Core'; use strict; use warnings; __PACKAGE__->table('dummy'); __PACKAGE__->add_columns( 'id' => { data_type => 'integer', is_auto_increment => 1 }, 'gittery' => { data_type => 'varchar', size => 100, is_nullable => 1, }, ); __PACKAGE__->set_primary_key('id'); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/Event.pm0000644000175000017500000000132313556035664025671 0ustar ahartmaiahartmaipackage DBICTest::Schema::Event; use strict; use warnings; use base qw/DBIx::Class::Core/; __PACKAGE__->load_components(qw/InflateColumn::DateTime/); __PACKAGE__->table('event'); __PACKAGE__->add_columns( id => { data_type => 'integer', is_auto_increment => 1 }, starts_at => { data_type => 'datetime', datetime_undef_if_invalid => 1 }, created_on => { data_type => 'timestamp' }, varchar_date => { data_type => 'varchar', inflate_date => 1, size => 20, is_nullable => 1 }, varchar_datetime => { data_type => 'varchar', inflate_datetime => 1, size => 20, is_nullable => 1 }, skip_inflation => { data_type => 'datetime', inflate_datetime => 0, is_nullable => 1 }, ); __PACKAGE__->set_primary_key('id'); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/Genre.pm0000644000175000017500000000116713556035664025656 0ustar ahartmaiahartmaipackage DBICTest::Schema::Genre; use strict; use base 'DBIx::Class::Core'; __PACKAGE__->table('genre'); __PACKAGE__->add_columns( genreid => { data_type => 'integer', is_auto_increment => 1, }, name => { data_type => 'varchar', size => 100, }, ); __PACKAGE__->set_primary_key('genreid'); __PACKAGE__->add_unique_constraint ( genre_name => [qw/name/] ); __PACKAGE__->has_many (cds => 'DBICTest::Schema::CD', 'genreid'); __PACKAGE__->resultset_class( __PACKAGE__ . '::ResultSet'); package DBICTest::Schema::Genre::ResultSet; use base qw( DBIx::Class::ResultSet::RecursiveUpdate ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/Image.pm0000644000175000017500000000131113556035664025627 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Image; use base qw/DBIx::Class::Core/; __PACKAGE__->table('images'); __PACKAGE__->add_columns( 'id' => { data_type => 'integer', is_auto_increment => 1, }, 'artwork_id' => { data_type => 'integer', is_foreign_key => 1, }, 'name' => { data_type => 'varchar', size => 100, }, 'data' => { data_type => 'blob', is_nullable => 1, }, ); __PACKAGE__->set_primary_key('id'); __PACKAGE__->belongs_to('artwork', 'DBICTest::Schema::Artwork', 'artwork_id'); __PACKAGE__->resultset_class( __PACKAGE__ . 
'::ResultSet'); package DBICTest::Schema::Image::ResultSet; use base qw( DBIx::Class::ResultSet::RecursiveUpdate ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/Owners.pm0000644000175000017500000000067313556035664026074 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Owners; use base qw/DBIx::Class::Core/; __PACKAGE__->table('owners'); __PACKAGE__->add_columns( 'ownerid' => { data_type => 'integer', is_auto_increment => 1, }, 'name' => { data_type => 'varchar', size => '100', }, ); __PACKAGE__->set_primary_key('ownerid'); __PACKAGE__->has_many(books => "DBICTest::Schema::BooksInLibrary", "owner"); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/Artist.pm0000644000175000017500000000361013556035664026057 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Artist; use base 'DBIx::Class::Core'; __PACKAGE__->table('artist'); __PACKAGE__->source_info({ "source_info_key_A" => "source_info_value_A", "source_info_key_B" => "source_info_value_B", "source_info_key_C" => "source_info_value_C", }); __PACKAGE__->add_columns( 'artistid' => { data_type => 'integer', is_auto_increment => 1, }, 'name' => { data_type => 'varchar', size => 100, is_nullable => 1, }, rank => { data_type => 'integer', default_value => 13, }, charfield => { data_type => 'char', size => 10, is_nullable => 1, }, ); __PACKAGE__->set_primary_key('artistid'); __PACKAGE__->mk_classdata('field_name_for', { artistid => 'primary key', name => 'artist name', }); __PACKAGE__->has_many( cds => 'DBICTest::Schema::CD', undef, { order_by => 'year' }, ); __PACKAGE__->has_many( cds_unordered => 'DBICTest::Schema::CD' ); __PACKAGE__->has_many( twokeys => 'DBICTest::Schema::TwoKeys' ); __PACKAGE__->has_many( onekeys => 'DBICTest::Schema::OneKey' ); __PACKAGE__->has_many( artist_undirected_maps => 'DBICTest::Schema::ArtistUndirectedMap', [ {'foreign.id1' => 'self.artistid'}, {'foreign.id2' => 'self.artistid'} ], { cascade_copy => 0 } # this would *so* not make sense ); __PACKAGE__->has_many( artist_to_artwork => 'DBICTest::Schema::Artwork_to_Artist' => 'artist_id' ); __PACKAGE__->many_to_many('artworks', 'artist_to_artwork', 'artwork'); sub sqlt_deploy_hook { my ($self, $sqlt_table) = @_; if ($sqlt_table->schema->translator->producer_type =~ /SQLite$/ ) { $sqlt_table->add_index( name => 'artist_name', fields => ['name'] ) or die $sqlt_table->error; } } __PACKAGE__->resultset_class( __PACKAGE__ . 
'::ResultSet'); package DBICTest::Schema::Artist::ResultSet; use base qw( DBIx::Class::ResultSet::RecursiveUpdate ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/OneKey.pm0000644000175000017500000000054413556035664026006 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::OneKey; use base 'DBIx::Class::Core'; __PACKAGE__->table('onekey'); __PACKAGE__->add_columns( 'id' => { data_type => 'integer', is_auto_increment => 1, }, 'artist' => { data_type => 'integer', }, 'cd' => { data_type => 'integer', }, ); __PACKAGE__->set_primary_key('id'); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/Lyrics.pm0000644000175000017500000000124413556035664026057 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Lyrics; use base qw/DBIx::Class::Core/; __PACKAGE__->table('lyrics'); __PACKAGE__->add_columns( 'lyric_id' => { data_type => 'integer', is_auto_increment => 1, }, 'track_id' => { data_type => 'integer', is_foreign_key => 1, }, ); __PACKAGE__->set_primary_key('lyric_id'); __PACKAGE__->belongs_to('track', 'DBICTest::Schema::Track', 'track_id'); __PACKAGE__->has_many('lyric_versions', 'DBICTest::Schema::LyricVersion', 'lyric_id'); __PACKAGE__->resultset_class( __PACKAGE__ . '::ResultSet'); package DBICTest::Schema::Lyrics::ResultSet; use base qw( DBIx::Class::ResultSet::RecursiveUpdate ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/Artwork.pm0000644000175000017500000000131713556035664026244 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Artwork; use base qw/DBIx::Class::Core/; __PACKAGE__->table('cd_artwork'); __PACKAGE__->add_columns( 'cd_id' => { data_type => 'integer', }, ); __PACKAGE__->set_primary_key('cd_id'); __PACKAGE__->belongs_to('cd', 'DBICTest::Schema::CD', 'cd_id'); __PACKAGE__->has_many('images', 'DBICTest::Schema::Image', 'artwork_id'); __PACKAGE__->has_many('artwork_to_artist', 'DBICTest::Schema::Artwork_to_Artist', 'artwork_cd_id'); __PACKAGE__->many_to_many('artists', 'artwork_to_artist', 'artist'); __PACKAGE__->resultset_class( __PACKAGE__ . 
'::ResultSet'); package DBICTest::Schema::Artwork::ResultSet; use base qw( DBIx::Class::ResultSet::RecursiveUpdate ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/TwoKeys.pm0000755000175000017500000000135413556035664026224 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::TwoKeys; use base 'DBIx::Class::Core'; __PACKAGE__->table('twokeys'); __PACKAGE__->add_columns( 'artist' => { data_type => 'integer' }, 'cd' => { data_type => 'integer' }, ); __PACKAGE__->set_primary_key(qw/artist cd/); __PACKAGE__->belongs_to( artist => 'DBICTest::Schema::Artist', {'foreign.artistid'=>'self.artist'}, ); __PACKAGE__->belongs_to( cd => 'DBICTest::Schema::CD', undef, { is_deferrable => 0, add_fk_index => 0 } ); __PACKAGE__->has_many( 'fourkeys_to_twokeys', 'DBICTest::Schema::FourKeys_to_TwoKeys', { 'foreign.t_artist' => 'self.artist', 'foreign.t_cd' => 'self.cd', }); __PACKAGE__->many_to_many( 'fourkeys', 'fourkeys_to_twokeys', 'fourkeys', ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/EventTZ.pm0000644000175000017500000000075013556035664026152 0ustar ahartmaiahartmaipackage DBICTest::Schema::EventTZ; use strict; use warnings; use base qw/DBIx::Class::Core/; __PACKAGE__->load_components(qw/InflateColumn::DateTime/); __PACKAGE__->table('event'); __PACKAGE__->add_columns( id => { data_type => 'integer', is_auto_increment => 1 }, starts_at => { data_type => 'datetime', timezone => "America/Chicago" }, created_on => { data_type => 'timestamp', timezone => "America/Chicago", floating_tz_ok => 1 }, ); __PACKAGE__->set_primary_key('id'); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/SelfRef.pm0000644000175000017500000000066713556035664026150 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::SelfRef; use base 'DBIx::Class::Core'; __PACKAGE__->table('self_ref'); __PACKAGE__->add_columns( 'id' => { data_type => 'integer', is_auto_increment => 1, }, 'name' => { data_type => 'varchar', size => 100, }, ); __PACKAGE__->set_primary_key('id'); __PACKAGE__->has_many( aliases => 'DBICTest::Schema::SelfRefAlias' => 'self_ref' ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/FourKeys.pm0000644000175000017500000000135213556035664026361 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::FourKeys; use base 'DBIx::Class::Core'; __PACKAGE__->table('fourkeys'); __PACKAGE__->add_columns( 'foo' => { data_type => 'integer' }, 'bar' => { data_type => 'integer' }, 'hello' => { data_type => 'integer' }, 'goodbye' => { data_type => 'integer' }, 'sensors' => { data_type => 'character' }, ); __PACKAGE__->set_primary_key(qw/foo bar hello goodbye/); __PACKAGE__->has_many( 'fourkeys_to_twokeys', 'DBICTest::Schema::FourKeys_to_TwoKeys', { 'foreign.f_foo' => 'self.foo', 'foreign.f_bar' => 'self.bar', 'foreign.f_hello' => 'self.hello', 'foreign.f_goodbye' => 'self.goodbye', }); __PACKAGE__->many_to_many( 'twokeys', 'fourkeys_to_twokeys', 'twokeys', ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/Producer.pm0000644000175000017500000000133713556035664026400 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Producer; use base 'DBIx::Class::Core'; __PACKAGE__->table('producer'); __PACKAGE__->add_columns( 'producerid' => { data_type => 'integer', is_auto_increment => 1 }, 'name' => { data_type => 'varchar', size => 100, }, ); __PACKAGE__->set_primary_key('producerid'); __PACKAGE__->add_unique_constraint(prod_name => [ qw/name/ ]); __PACKAGE__->has_many( producer_to_cd => 
'DBICTest::Schema::CD_to_Producer' => 'producer' ); __PACKAGE__->many_to_many('cds', 'producer_to_cd', 'cd'); __PACKAGE__->resultset_class( __PACKAGE__ . '::ResultSet'); package DBICTest::Schema::Producer::ResultSet; use base qw( DBIx::Class::ResultSet::RecursiveUpdate ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/Employee.pm0000644000175000017500000000174013556035664026372 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Employee; use base 'DBIx::Class::Core'; __PACKAGE__->load_components(qw( Ordered )); __PACKAGE__->table('employee'); __PACKAGE__->add_columns( employee_id => { data_type => 'integer', is_auto_increment => 1 }, position => { data_type => 'integer', }, group_id => { data_type => 'integer', is_nullable => 1, }, group_id_2 => { data_type => 'integer', is_nullable => 1, }, name => { data_type => 'varchar', size => 100, is_nullable => 1, }, ); __PACKAGE__->set_primary_key('employee_id'); __PACKAGE__->position_column('position'); #__PACKAGE__->add_unique_constraint(position_group => [ qw/position group_id/ ]); __PACKAGE__->mk_classdata('field_name_for', { employee_id => 'primary key', position => 'list position', group_id => 'collection column', name => 'employee name', }); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/Bookmark.pm0000644000175000017500000000111713556035664026356 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Bookmark; use base 'DBIx::Class::Core'; use strict; use warnings; __PACKAGE__->table('bookmark'); __PACKAGE__->add_columns( 'id' => { data_type => 'integer', is_auto_increment => 1 }, 'link' => { data_type => 'integer', }, ); __PACKAGE__->set_primary_key('id'); __PACKAGE__->belongs_to(link => 'DBICTest::Schema::Link' ); __PACKAGE__->resultset_class( __PACKAGE__ . 
'::ResultSet'); package DBICTest::Schema::Bookmark::ResultSet; use base qw( DBIx::Class::ResultSet::RecursiveUpdate ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/TreeLike.pm0000644000175000017500000000160313556035664026315 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::TreeLike; use base qw/DBIx::Class::Core/; __PACKAGE__->table('treelike'); __PACKAGE__->add_columns( 'id' => { data_type => 'integer', is_auto_increment => 1 }, 'parent' => { data_type => 'integer' , is_nullable=>1}, 'name' => { data_type => 'varchar', size => 100, }, ); __PACKAGE__->set_primary_key(qw/id/); __PACKAGE__->belongs_to('parent', 'TreeLike', { 'foreign.id' => 'self.parent' }); __PACKAGE__->has_many('children', 'TreeLike', { 'foreign.parent' => 'self.id' }); ## since this is a self referential table we need to do a post deploy hook and get ## some data in while constraints are off sub sqlt_deploy_hook { my ($self, $sqlt_table) = @_; ## We don't seem to need this anymore, but keeping it for the moment ## $sqlt_table->add_index(name => 'idx_name', fields => ['name']); } 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/Serialized.pm0000644000175000017500000000042513556035664026705 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Serialized; use base 'DBIx::Class::Core'; __PACKAGE__->table('serialized'); __PACKAGE__->add_columns( 'id' => { data_type => 'integer' }, 'serialized' => { data_type => 'text' }, ); __PACKAGE__->set_primary_key('id'); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/FileColumn.pm0000644000175000017500000000056413556035664026653 0ustar ahartmaiahartmaipackage DBICTest::Schema::FileColumn; use strict; use warnings; use base qw/DBIx::Class::Core/; use File::Temp qw/tempdir/; __PACKAGE__->table('file_columns'); __PACKAGE__->add_columns( id => { data_type => 'integer', is_auto_increment => 1 }, file => { data_type => 'varchar', size => 255 } ); __PACKAGE__->set_primary_key('id'); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/LinerNotes.pm0000644000175000017500000000106713556035664026677 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::LinerNotes; use base qw/DBIx::Class::Core/; __PACKAGE__->table('liner_notes'); __PACKAGE__->add_columns( 'liner_id' => { data_type => 'integer', }, 'notes' => { data_type => 'varchar', size => 100, }, ); __PACKAGE__->set_primary_key('liner_id'); __PACKAGE__->belongs_to( 'cd', 'DBICTest::Schema::CD', 'liner_id' ); __PACKAGE__->resultset_class( __PACKAGE__ . 
'::ResultSet'); package DBICTest::Schema::LinerNotes::ResultSet; use base qw( DBIx::Class::ResultSet::RecursiveUpdate ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/Collection.pm0000644000175000017500000000173213556035664026707 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Collection; use base qw/DBIx::Class::Core/; __PACKAGE__->table('collection'); __PACKAGE__->add_columns( 'collectionid' => { data_type => 'integer', is_auto_increment => 1, }, 'name' => { data_type => 'varchar', size => 100, }, ); __PACKAGE__->set_primary_key('collectionid'); __PACKAGE__->has_many( collection_object => "DBICTest::Schema::CollectionObject", { "foreign.collection" => "self.collectionid" } ); __PACKAGE__->many_to_many( objects => collection_object => "object" ); __PACKAGE__->many_to_many( pointy_objects => collection_object => "object", { where => { "object.type" => "pointy" } } ); __PACKAGE__->many_to_many( round_objects => collection_object => "object", { where => { "object.type" => "round" } } ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/NoSuchClass.pm0000644000175000017500000000017113556035664026775 0ustar ahartmaiahartmaipackage DBICTest::Schema::NoSuchClass; ## This is purposefully not a real DBIC class ## Used in t/102load_classes.t 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/TypedObject.pm0000644000175000017500000000124413556035664027026 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::TypedObject; use base qw/DBIx::Class::Core/; __PACKAGE__->table('typed_object'); __PACKAGE__->add_columns( 'objectid' => { data_type => 'integer', is_auto_increment => 1, }, 'type' => { data_type => 'varchar', size => '100', }, 'value' => { data_type => 'varchar', size => 100, }, ); __PACKAGE__->set_primary_key('objectid'); __PACKAGE__->has_many( collection_object => "DBICTest::Schema::CollectionObject", { "foreign.object" => "self.objectid" } ); __PACKAGE__->many_to_many( collections => collection_object => "collection" ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/NoPrimaryKey.pm0000644000175000017500000000053213556035664027202 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::NoPrimaryKey; use base 'DBIx::Class::Core'; __PACKAGE__->table('noprimarykey'); __PACKAGE__->add_columns( 'foo' => { data_type => 'integer' }, 'bar' => { data_type => 'integer' }, 'baz' => { data_type => 'integer' }, ); __PACKAGE__->add_unique_constraint(foo_bar => [ qw/foo bar/ ]); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/LyricVersion.pm0000644000175000017500000000122513556035664027241 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::LyricVersion; use base qw/DBIx::Class::Core/; __PACKAGE__->table('lyric_versions'); __PACKAGE__->add_columns( 'id' => { data_type => 'integer', is_auto_increment => 1, }, 'lyric_id' => { data_type => 'integer', is_foreign_key => 1, }, 'text' => { data_type => 'varchar', size => 100, }, ); __PACKAGE__->set_primary_key('id'); __PACKAGE__->belongs_to('lyric', 'DBICTest::Schema::Lyrics', 'lyric_id'); __PACKAGE__->resultset_class( __PACKAGE__ . 
'::ResultSet'); package DBICTest::Schema::LyricVersion::ResultSet; use base qw( DBIx::Class::ResultSet::RecursiveUpdate ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/SelfRefAlias.pm0000644000175000017500000000072513556035664027115 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::SelfRefAlias; use base 'DBIx::Class::Core'; __PACKAGE__->table('self_ref_alias'); __PACKAGE__->add_columns( 'self_ref' => { data_type => 'integer', }, 'alias' => { data_type => 'integer', }, ); __PACKAGE__->set_primary_key(qw/self_ref alias/); __PACKAGE__->belongs_to( self_ref => 'DBICTest::Schema::SelfRef' ); __PACKAGE__->belongs_to( alias => 'DBICTest::Schema::SelfRef' ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/ForceForeign.pm0000644000175000017500000000165113556035664027164 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::ForceForeign; use base 'DBIx::Class::Core'; __PACKAGE__->table('forceforeign'); __PACKAGE__->add_columns( 'artist' => { data_type => 'integer' }, 'cd' => { data_type => 'integer' }, ); __PACKAGE__->set_primary_key(qw/artist/); # Normally this would not appear as a FK constraint # since it uses the PK __PACKAGE__->might_have( 'artist_1', 'DBICTest::Schema::Artist', { 'foreign.artistid' => 'self.artist', }, { is_foreign_key_constraint => 1, }, ); # Normally this would appear as a FK constraint __PACKAGE__->might_have( 'cd_1', 'DBICTest::Schema::CD', { 'foreign.cdid' => 'self.cd', }, { is_foreign_key_constraint => 0, }, ); # Normally this would appear as a FK constraint __PACKAGE__->belongs_to( 'cd_3', 'DBICTest::Schema::CD', { 'foreign.cdid' => 'self.cd', }, { is_foreign_key_constraint => 0, }, ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/SequenceTest.pm0000644000175000017500000000150713556035664027224 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::SequenceTest; use base 'DBIx::Class::Core'; __PACKAGE__->table('sequence_test'); __PACKAGE__->source_info({ "source_info_key_A" => "source_info_value_A", "source_info_key_B" => "source_info_value_B", "source_info_key_C" => "source_info_value_C", "source_info_key_D" => "source_info_value_D", }); __PACKAGE__->add_columns( 'pkid1' => { data_type => 'integer', auto_nextval => 1, sequence => 'pkid1_seq', }, 'pkid2' => { data_type => 'integer', auto_nextval => 1, sequence => 'pkid2_seq', }, 'nonpkid' => { data_type => 'integer', auto_nextval => 1, sequence => 'nonpkid_seq', }, 'name' => { data_type => 'varchar', size => 100, is_nullable => 1, }, ); __PACKAGE__->set_primary_key('pkid1', 'pkid2'); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/CD_to_Producer.pm0000644000175000017500000000132713556035664027447 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::CD_to_Producer; use base 'DBIx::Class::Core'; __PACKAGE__->table('cd_to_producer'); __PACKAGE__->add_columns( cd => { data_type => 'integer' }, producer => { data_type => 'integer' }, ); __PACKAGE__->set_primary_key(qw/cd producer/); __PACKAGE__->belongs_to( 'cd', 'DBICTest::Schema::CD', { 'foreign.cdid' => 'self.cd' } ); __PACKAGE__->belongs_to( 'producer', 'DBICTest::Schema::Producer', { 'foreign.producerid' => 'self.producer' }, { on_delete => undef, on_update => undef }, ); __PACKAGE__->resultset_class( __PACKAGE__ . 
'::ResultSet'); package DBICTest::Schema::CD_to_Producer::ResultSet; use base qw( DBIx::Class::ResultSet::RecursiveUpdate ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/ArtistSubclass.pm0000644000175000017500000000022213556035664027553 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::ArtistSubclass; use base 'DBICTest::Schema::Artist'; __PACKAGE__->table(__PACKAGE__->table); 1;DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/TwoKeyTreeLike.pm0000644000175000017500000000124113556035664027456 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::TwoKeyTreeLike; use base qw/DBIx::Class::Core/; __PACKAGE__->table('twokeytreelike'); __PACKAGE__->add_columns( 'id1' => { data_type => 'integer' }, 'id2' => { data_type => 'integer' }, 'parent1' => { data_type => 'integer' }, 'parent2' => { data_type => 'integer' }, 'name' => { data_type => 'varchar', size => 100, }, ); __PACKAGE__->set_primary_key(qw/id1 id2/); __PACKAGE__->add_unique_constraint('tktlnameunique' => ['name']); __PACKAGE__->belongs_to('parent', 'DBICTest::Schema::TwoKeyTreeLike', { 'foreign.id1' => 'self.parent1', 'foreign.id2' => 'self.parent2'}); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/BooksInLibrary.pm0000644000175000017500000000117613556035664027507 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::BooksInLibrary; use base qw/DBIx::Class::Core/; __PACKAGE__->table('books'); __PACKAGE__->add_columns( 'id' => { data_type => 'integer', is_auto_increment => 1, }, 'source' => { data_type => 'varchar', size => '100', }, 'owner' => { data_type => 'integer', }, 'title' => { data_type => 'varchar', size => '100', }, 'price' => { data_type => 'integer', is_nullable => 1, }, ); __PACKAGE__->set_primary_key('id'); __PACKAGE__->resultset_attributes({where => { source => "Library" } }); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/CollectionObject.pm0000644000175000017500000000125413556035664030035 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::CollectionObject; use base qw/DBIx::Class::Core/; __PACKAGE__->table('collection_object'); __PACKAGE__->add_columns( 'collection' => { data_type => 'integer', }, 'object' => { data_type => 'integer', }, ); __PACKAGE__->set_primary_key(qw/collection object/); __PACKAGE__->belongs_to( collection => "DBICTest::Schema::Collection", { "foreign.collectionid" => "self.collection" } ); __PACKAGE__->belongs_to( object => "DBICTest::Schema::TypedObject", { "foreign.objectid" => "self.object" } ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/ArtistSourceName.pm0000644000175000017500000000030313556035664030035 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::ArtistSourceName; use base 'DBICTest::Schema::Artist'; __PACKAGE__->table(__PACKAGE__->table); __PACKAGE__->source_name('SourceNameArtists'); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/Artwork_to_Artist.pm0000644000175000017500000000132613556035664030274 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Artwork_to_Artist; use base qw/DBIx::Class::Core/; __PACKAGE__->table('artwork_to_artist'); __PACKAGE__->add_columns( 'artwork_cd_id' => { data_type => 'integer', is_foreign_key => 1, }, 'artist_id' => { data_type => 'integer', is_foreign_key => 1, }, ); __PACKAGE__->set_primary_key(qw/artwork_cd_id artist_id/); __PACKAGE__->belongs_to('artwork', 'DBICTest::Schema::Artwork', 'artwork_cd_id'); __PACKAGE__->belongs_to('artist', 
'DBICTest::Schema::Artist', 'artist_id'); __PACKAGE__->resultset_class( __PACKAGE__ . '::ResultSet'); package DBICTest::Schema::Artwork_to_Artist::ResultSet; use base qw( DBIx::Class::ResultSet::RecursiveUpdate ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/ArtistUndirectedMap.pm0000644000175000017500000000127313556035664030527 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::ArtistUndirectedMap; use base 'DBIx::Class::Core'; __PACKAGE__->table('artist_undirected_map'); __PACKAGE__->add_columns( 'id1' => { data_type => 'integer' }, 'id2' => { data_type => 'integer' }, ); __PACKAGE__->set_primary_key(qw/id1 id2/); __PACKAGE__->belongs_to( 'artist1', 'DBICTest::Schema::Artist', 'id1', { on_delete => 'RESTRICT', on_update => 'CASCADE'} ); __PACKAGE__->belongs_to( 'artist2', 'DBICTest::Schema::Artist', 'id2', { on_delete => undef, on_update => 'CASCADE'} ); __PACKAGE__->has_many( 'mapped_artists', 'DBICTest::Schema::Artist', [ {'foreign.artistid' => 'self.id1'}, {'foreign.artistid' => 'self.id2'} ], ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Schema/FourKeys_to_TwoKeys.pm0000644000175000017500000000163013556035664030547 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::FourKeys_to_TwoKeys; use base 'DBIx::Class::Core'; __PACKAGE__->table('fourkeys_to_twokeys'); __PACKAGE__->add_columns( 'f_foo' => { data_type => 'integer' }, 'f_bar' => { data_type => 'integer' }, 'f_hello' => { data_type => 'integer' }, 'f_goodbye' => { data_type => 'integer' }, 't_artist' => { data_type => 'integer' }, 't_cd' => { data_type => 'integer' }, 'autopilot' => { data_type => 'character' }, ); __PACKAGE__->set_primary_key( qw/f_foo f_bar f_hello f_goodbye t_artist t_cd/ ); __PACKAGE__->belongs_to('fourkeys', 'DBICTest::Schema::FourKeys', { 'foreign.foo' => 'self.f_foo', 'foreign.bar' => 'self.f_bar', 'foreign.hello' => 'self.f_hello', 'foreign.goodbye' => 'self.f_goodbye', }); __PACKAGE__->belongs_to('twokeys', 'DBICTest::Schema::TwoKeys', { 'foreign.artist' => 'self.t_artist', 'foreign.cd' => 'self.t_cd', }); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Plain/0000775000175000017500000000000013556035664024120 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/Plain/Test.pm0000644000175000017500000000045613556035664025400 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Plain::Test; use base 'DBIx::Class::Core'; __PACKAGE__->table('test'); __PACKAGE__->add_columns( 'id' => { data_type => 'integer', is_auto_increment => 1 }, 'name' => { data_type => 'varchar', }, ); __PACKAGE__->set_primary_key('id'); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/FakeComponent.pm0000644000175000017500000000020413556035664026136 0ustar ahartmaiahartmai# belongs to t/run/90ensure_class_loaded.tl package # hide from PAUSE DBICTest::FakeComponent; use warnings; use strict; 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/ErrorComponent.pm0000644000175000017500000000024413556035664026365 0ustar ahartmaiahartmai# belongs to t/run/90ensure_class_loaded.tl package # hide from PAUSE DBICTest::ErrorComponent; use warnings; use strict; # this is missing on purpose # 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/ForeignComponent.pm0000644000175000017500000000034413556035664026666 0ustar ahartmaiahartmai# belongs to t/05components.t package # hide from PAUSE DBICTest::ForeignComponent; use warnings; use strict; use base qw/ DBIx::Class /; __PACKAGE__->load_components( qw/ 
+DBICTest::ForeignComponent::TestComp / ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/ResultSetManager.pm0000644000175000017500000000020213556035664026630 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::ResultSetManager; use base 'DBIx::Class::Schema'; __PACKAGE__->load_classes("Foo"); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/OptionalComponent.pm0000644000175000017500000000021013556035664027052 0ustar ahartmaiahartmai# belongs to t/run/90ensure_class_loaded.tl package # hide from PAUSE DBICTest::OptionalComponent; use warnings; use strict; 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/ResultSetManager/0000775000175000017500000000000013556035664026302 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/ResultSetManager/Foo.pm0000644000175000017500000000032313556035664027357 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::ResultSetManager::Foo; use base 'DBIx::Class'; __PACKAGE__->load_components(qw/ ResultSetManager Core /); __PACKAGE__->table('foo'); sub bar : ResultSet { 'good' } 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/SyntaxErrorComponent3.pm0000644000175000017500000000010613556035664027654 0ustar ahartmaiahartmaipackage DBICErrorTest::SyntaxError; use strict; I'm a syntax error! DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/SyntaxErrorComponent2.pm0000644000175000017500000000025013556035664027653 0ustar ahartmaiahartmai# belongs to t/run/90ensure_class_loaded.tl package # hide from PAUSE DBICTest::SyntaxErrorComponent2; use warnings; use strict; my $str ''; # syntax error 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/SyntaxErrorComponent1.pm0000644000175000017500000000025013556035664027652 0ustar ahartmaiahartmai# belongs to t/run/90ensure_class_loaded.tl package # hide from PAUSE DBICTest::SyntaxErrorComponent1; use warnings; use strict; my $str ''; # syntax error 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/ForeignComponent/0000775000175000017500000000000013556035664026331 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBICTest/ForeignComponent/TestComp.pm0000644000175000017500000000024113556035664030420 0ustar ahartmaiahartmai# belongs to t/05components.t package # hide from PAUSE DBICTest::ForeignComponent::TestComp; use warnings; use strict; sub foreign_test_method { 1 } 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBSchema/0000775000175000017500000000000013556035664023122 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBSchema/Result/0000775000175000017500000000000013556035664024400 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBSchema/Result/Tag.pm0000644000175000017500000000132513556035664025450 0ustar ahartmaiahartmaipackage DBSchema::Result::Tag; # Created by DBIx::Class::Schema::Loader v0.03000 @ 2006-10-02 08:24:09 use strict; use warnings; use base 'DBIx::Class'; use overload '""' => sub {$_[0]->name}, fallback => 1; __PACKAGE__->load_components("PK::Auto", "Core"); __PACKAGE__->table("tag"); __PACKAGE__->add_columns( "id" => { data_type => 'integer', is_auto_increment => 1 }, 'name' => { data_type => 'varchar', size => 100, is_nullable => 1, }, 'file' => { data_type => 'text', is_nullable => 1, } ); __PACKAGE__->set_primary_key("id"); __PACKAGE__->has_many("dvdtags", "Dvdtag", { "foreign.tag" => "self.id" }); __PACKAGE__->many_to_many('dvds', 'dvdtags' => 'dvd'); 1; 
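The Tag class above and the Dvd class that follows are joined through the dvdtags link table, and Dvd loads IntrospectableM2M, so recursive_update can take many-to-many 'tags' data directly. The short sketch below is not part of the shipped test files; it assumes the distribution's DBSchema schema class loads these Result classes, can be deployed to an in-memory SQLite database, and uses DBIx::Class::ResultSet::RecursiveUpdate as its resultset class.

use strict;
use warnings;
use lib 't/lib';
use DBSchema;

# assumed: DBSchema is the test schema class tying together the Result classes shown here
my $schema = DBSchema->connect('dbi:SQLite:dbname=:memory:');
$schema->deploy;

# create a Dvd together with its owner (belongs_to) and its tags
# (many_to_many via dvdtags); IntrospectableM2M lets recursive_update
# rewrite the 'tags' key into has_many data on dvdtags
my $dvd = $schema->resultset('Dvd')->recursive_update({
    name  => 'The Third Man',
    owner => { username => 'carol', password => 'secret', name => 'Carol Reed' },
    tags  => [ { name => 'noir' }, { name => 'classic' } ],
});
print $dvd->tags->count, "\n";    # expected: 2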
DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBSchema/Result/Dvd.pm0000644000175000017500000000366313556035664025461 0ustar ahartmaiahartmaipackage DBSchema::Result::Dvd; # Created by DBIx::Class::Schema::Loader v0.03000 @ 2006-10-02 08:24:09 use strict; use warnings; use base 'DBIx::Class'; use overload '""' => sub {$_[0]->name}, fallback => 1; __PACKAGE__->load_components(qw/IntrospectableM2M Core/); __PACKAGE__->table('dvd'); __PACKAGE__->add_columns( 'dvd_id' => { data_type => 'integer', is_auto_increment => 1 }, 'name' => { data_type => 'varchar', size => 100, is_nullable => 1, }, 'imdb_id' => { data_type => 'varchar', size => 100, is_nullable => 1, }, 'owner' => { data_type => 'integer' }, 'current_borrower' => { data_type => 'integer', is_nullable => 1, }, 'creation_date' => { data_type => 'datetime', is_nullable => 1, }, 'alter_date' => { data_type => 'datetime', is_nullable => 1, }, 'twokeysfk' => { data_type => 'integer', is_nullable => 1, }, ); __PACKAGE__->set_primary_key('dvd_id'); __PACKAGE__->belongs_to('owner', 'DBSchema::Result::User', 'owner'); __PACKAGE__->belongs_to('current_borrower', 'DBSchema::Result::User', 'current_borrower', { join_type => "LEFT" }); __PACKAGE__->has_many('dvdtags', 'Dvdtag', { 'foreign.dvd' => 'self.dvd_id' }); __PACKAGE__->has_many('viewings', 'DBSchema::Result::Viewing', { 'foreign.dvd_id' => 'self.dvd_id' }); __PACKAGE__->many_to_many('tags', 'dvdtags' => 'tag'); __PACKAGE__->might_have( liner_notes => 'DBSchema::Result::LinerNotes', undef, { proxy => [ qw/notes/ ] }, ); __PACKAGE__->add_relationship('like_has_many', 'DBSchema::Result::Twokeys', { 'foreign.dvd_name' => 'self.name' }, { accessor => 'multi', accessor_name => 'like_has_many' } ); __PACKAGE__->add_relationship('like_has_many2', 'DBSchema::Result::Twokeys_belongsto', { 'foreign.key1' => 'self.twokeysfk' }, { accessor => 'multi' }, ); __PACKAGE__->has_many( keysbymethod => 'KeysByMethod', { 'foreign.dvd' => 'self.dvd_id' } ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBSchema/Result/Role.pm0000644000175000017500000000124013556035664025632 0ustar ahartmaiahartmaipackage DBSchema::Result::Role; # Created by DBIx::Class::Schema::Loader v0.03000 @ 2006-10-02 08:24:09 use strict; use warnings; use base 'DBIx::Class'; use overload '""' => sub {$_[0]->id}, fallback => 1; __PACKAGE__->load_components("PK::Auto", "Core"); __PACKAGE__->table("role"); __PACKAGE__->add_columns( "id" => { data_type => 'integer', is_auto_increment => 1, }, "role" => { data_type => 'varchar', size => '100', } ); __PACKAGE__->set_primary_key("id"); __PACKAGE__->has_many("user_roles", "UserRole", { "foreign.role" => "self.id" }); __PACKAGE__->many_to_many('users', 'user_roles' => 'user'); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBSchema/Result/User.pm0000644000175000017500000000215413556035664025654 0ustar ahartmaiahartmaipackage DBSchema::Result::User; # Created by DBIx::Class::Schema::Loader v0.03000 @ 2006-10-02 08:24:09 use strict; use warnings; use base 'DBIx::Class'; #use overload '""' => sub {$_[0]->name}, fallback => 1; __PACKAGE__->load_components('Core'); __PACKAGE__->table("usr"); __PACKAGE__->add_columns( "id" => { data_type => 'integer', is_auto_increment => 1, }, "username" => { data_type => 'varchar', size => '100', }, "password" => { data_type => 'varchar', size => '100', }, "name" => { data_type => 'varchar', size => '100', }, ); __PACKAGE__->set_primary_key("id"); __PACKAGE__->has_many("user_roles", "UserRole", { "foreign.user" => "self.id" }); 
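# the remaining relationships: a user owns DVDs via dvd.owner, borrows DVDs via
# dvd.current_borrower, reaches roles through the user_role bridge table and
# might have a single address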
__PACKAGE__->has_many("owned_dvds", "Dvd", { "foreign.owner" => "self.id" }); __PACKAGE__->has_many( "borrowed_dvds", "Dvd", { "foreign.current_borrower" => "self.id" }, ); __PACKAGE__->many_to_many('roles', 'user_roles' => 'role'); __PACKAGE__->might_have( "address", "DBSchema::Result::Address", { 'foreign.user_id' => 'self.id' } ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBSchema/Result/Owner.pm0000644000175000017500000000150313556035664026025 0ustar ahartmaiahartmaipackage DBSchema::Result::Owner; use strict; use warnings; use base 'DBIx::Class::Core'; =head1 NAME DBSchema::Result::Owner =cut __PACKAGE__->table("owner"); =head1 ACCESSORS =head2 id data_type: 'integer' is_auto_increment: 1 is_nullable: 0 =head2 name data_type: 'text' default_value: (empty string) is_nullable: 0 =cut __PACKAGE__->add_columns( "id", { data_type => "integer", is_auto_increment => 1, is_nullable => 0 }, "name", { data_type => "text", default_value => "", is_nullable => 0 }, ); __PACKAGE__->set_primary_key("id"); =head1 RELATIONS =head2 podcasts Type: has_many Related object: L =cut __PACKAGE__->has_many( "podcasts", "DBSchema::Result::Podcast", { "foreign.owner_id" => "self.id" }, { cascade_copy => 0, cascade_delete => 0 }, ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBSchema/Result/Onekey.pm0000644000175000017500000000104713556035664026170 0ustar ahartmaiahartmaipackage DBSchema::Result::Onekey; # Created by DBIx::Class::Schema::Loader v0.03000 @ 2006-10-02 08:24:09 use strict; use warnings; use base 'DBIx::Class'; __PACKAGE__->load_components("PK::Auto", "Core"); __PACKAGE__->table("onekey"); __PACKAGE__->add_columns( "id" => { data_type => 'integer', is_auto_increment => 1 }, name => { data_type => 'varchar', size => 100, is_nullable => 1 }, ); __PACKAGE__->set_primary_key("id"); __PACKAGE__->might_have( twokeys_belongsto => 'DBSchema::Result::Twokeys_belongsto', 'key1', ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBSchema/Result/Dvdtag.pm0000644000175000017500000000105013556035664026141 0ustar ahartmaiahartmaipackage DBSchema::Result::Dvdtag; # Created by DBIx::Class::Schema::Loader v0.03000 @ 2006-10-02 08:24:09 use strict; use warnings; use base 'DBIx::Class'; __PACKAGE__->load_components("PK::Auto", "Core"); __PACKAGE__->table("dvdtag"); __PACKAGE__->add_columns( "dvd" => { data_type => 'integer' }, "tag" => { data_type => 'integer' }, ); __PACKAGE__->set_primary_key("dvd", "tag"); __PACKAGE__->belongs_to("dvd", "DBSchema::Result::Dvd", { dvd_id => "dvd" }); __PACKAGE__->belongs_to("tag", "DBSchema::Result::Tag", { id => "tag" }); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBSchema/Result/Viewing.pm0000755000175000017500000000073413556035664026353 0ustar ahartmaiahartmaipackage DBSchema::Result::Viewing; use base 'DBIx::Class::Core'; __PACKAGE__->table('viewing'); __PACKAGE__->add_columns( 'user_id' => { data_type => 'integer' }, 'dvd_id' => { data_type => 'integer' }, ); __PACKAGE__->set_primary_key(qw/user_id dvd_id/); __PACKAGE__->belongs_to( user => 'DBSchema::Result::User', {'foreign.id'=>'self.user_id'}, ); __PACKAGE__->belongs_to( dvd => 'DBSchema::Result::Dvd', {'foreign.dvd_id'=>'self.dvd_id'}, ); ; 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBSchema/Result/Address.pm0000644000175000017500000000126113556035664026321 0ustar ahartmaiahartmaipackage DBSchema::Result::Address; use strict; use warnings; use base 'DBIx::Class'; __PACKAGE__->load_components("Core"); __PACKAGE__->table("address"); __PACKAGE__->add_columns( "address_id", { data_type => 
"INTEGER", is_auto_increment => 1, is_nullable => 0 }, "user_id", { data_type => "INTEGER", is_nullable => 0 }, "street", { data_type => "VARCHAR", is_nullable => 0, size => 32 }, "city", { data_type => "VARCHAR", is_nullable => 0, size => 32 }, "state", { data_type => "VARCHAR", is_nullable => 0, size => 32 }, ); __PACKAGE__->set_primary_key("address_id"); __PACKAGE__->belongs_to( 'user', 'DBSchema::Result::User', 'user_id', ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBSchema/Result/Podcast.pm0000644000175000017500000000205613556035664026334 0ustar ahartmaiahartmaipackage DBSchema::Result::Podcast; use strict; use warnings; use base 'DBIx::Class::Core'; =head1 NAME testDB::Schema::Result::Dvd =cut __PACKAGE__->table("podcast"); =head1 ACCESSORS =head2 id data_type: 'integer' is_auto_increment: 1 is_nullable: 0 =head2 title data_type: 'text' default_value: (empty string) is_nullable: 0 =head2 owner_id data_type: 'integer' is_foreign_key: 1 is_nullable: 1 =cut __PACKAGE__->add_columns( "id", { data_type => "integer", is_auto_increment => 1, is_nullable => 0 }, "title", { data_type => "text", default_value => "", is_nullable => 0 }, "owner_id", { data_type => "integer", is_foreign_key => 1, is_nullable => 1 }, ); __PACKAGE__->set_primary_key("id"); =head1 RELATIONS =head2 owner Type: belongs_to Related object: L =cut __PACKAGE__->belongs_to( "owner", "DBSchema::Result::Owner", { id => "owner_id" }, { is_deferrable => 1, join_type => "LEFT", on_delete => "CASCADE", on_update => "CASCADE", }, ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBSchema/Result/Twokeys.pm0000644000175000017500000000106713556035664026405 0ustar ahartmaiahartmaipackage DBSchema::Result::Twokeys; # Created by DBIx::Class::Schema::Loader v0.03000 @ 2006-10-02 08:24:09 use strict; use warnings; use base 'DBIx::Class'; __PACKAGE__->load_components("PK::Auto", "Core"); __PACKAGE__->table("twokeys"); __PACKAGE__->add_columns( "dvd_name" => { data_type => 'varchar', size => 100 }, "key2" => { data_type => 'integer' }, ); __PACKAGE__->set_primary_key("dvd_name", "key2"); __PACKAGE__->add_relationship('like_belongs_to', 'DBSchema::Result::Dvd', { 'foreign.name' => 'self.dvd_name' }, { accessor => 'single' }); 1;DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBSchema/Result/UserRole.pm0000644000175000017500000000102013556035664026465 0ustar ahartmaiahartmaipackage DBSchema::Result::UserRole; # Created by DBIx::Class::Schema::Loader v0.03000 @ 2006-10-02 08:24:09 use strict; use warnings; use base 'DBIx::Class'; __PACKAGE__->load_components("PK::Auto", "Core"); __PACKAGE__->table("user_role"); __PACKAGE__->add_columns( "user" => { data_type => 'integer' } , "role" => { data_type => 'integer' } ); __PACKAGE__->set_primary_key("user", "role"); __PACKAGE__->belongs_to("user", "User", { id => "user" }); __PACKAGE__->belongs_to("role", "Role", { id => "role" }); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBSchema/Result/LinerNotes.pm0000644000175000017500000000061713556035664027022 0ustar ahartmaiahartmaipackage # hide from PAUSE DBSchema::Result::LinerNotes; use base qw/DBIx::Class::Core/; __PACKAGE__->table('liner_notes'); __PACKAGE__->add_columns( 'liner_id' => { data_type => 'integer', }, 'notes' => { data_type => 'varchar', size => 100, }, ); __PACKAGE__->set_primary_key('liner_id'); __PACKAGE__->belongs_to( 'dvd', 'DBSchema::Result::Dvd', 'liner_id' ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBSchema/Result/Personality.pm0000644000175000017500000000060413556035664027245 0ustar 
ahartmaiahartmaipackage DBSchema::Result::Personality; use strict; use warnings; use base 'DBIx::Class'; __PACKAGE__->load_components( "PK::Auto", "Core" ); __PACKAGE__->table("personality"); __PACKAGE__->add_columns( "user_id" => { data_type => 'integer' }, ); __PACKAGE__->set_primary_key("user_id"); __PACKAGE__->has_one( 'user', 'DBSchema::Result::User', {'foreign.id' => 'self.user_id'}, ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBSchema/Result/KeysByMethod.pm0000644000175000017500000000445413556035664027312 0ustar ahartmaiahartmaipackage DBSchema::Result::KeysByMethod; use strict; use warnings; use parent 'DBIx::Class::Core'; # # use Moose; # use MooseX::NonMoose; # extends 'DBIx::Class::Core'; __PACKAGE__->table("keysbymethod"); __PACKAGE__->add_columns( "dvd" => { data_type => 'integer' }, "key1" => { data_type => 'varchar', size => 16 }, "key2" => { data_type => 'varchar', size => 16 }, "value" => { data_type => 'varchar', size => 16, is_nullable => 1 }, ); __PACKAGE__->set_primary_key("dvd", "key1", "key2"); __PACKAGE__->belongs_to("dvd", "DBSchema::Result::Dvd", { dvd_id => "dvd" }); sub new { my $class = shift; my $attrs = shift; # remove non-column attribute 'combined_key' if ( ref $attrs eq 'HASH' && exists $attrs->{combined_key}) { # copy to avoid side effects cause by modifying the input params my %foreign_attrs = %$attrs; my $combined_key = delete $foreign_attrs{combined_key}; my ($key1, $key2) = split('/', $combined_key); $foreign_attrs{key1} = $key1; $foreign_attrs{key2} = $key2; return $class->SUPER::new(\%foreign_attrs, @_); } return $class->SUPER::new($attrs, @_); } # sub FOREIGNBUILDARGS { # my $class = shift; # my $attrs = shift; # # # remove non-column attribute 'combined_key' # if ( ref $attrs eq 'HASH' ) { # # copy to avoid side effects cause by modifying the input params # my %foreign_attrs = %$attrs; # delete $foreign_attrs{combined_key}; # return \%foreign_attrs, @_; # } # # return $attrs, @_; # } # has combined_key => ( # is => 'rw', # lazy => 1, # default => sub { # my $self = shift; # return $self->key1 . '/' . $self->key2 # if defined $self->key1 && defined $self->key2; # return; # }, # trigger => sub { # my ( $self, $combined_value ) = @_; # my ($key1, $key2) = split('/', $combined_value); # $self->key1($key1); # $self->key2($key2); # } # ); sub combined_key { my $self = shift; if (@_) { my $combined_value = shift; my ($key1, $key2) = split('/', $combined_value); $self->key1($key1); $self->key2($key2); } return $self->key1 . '/' . 
$self->key2; } # no Moose; # __PACKAGE__->meta->make_immutable; 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBSchema/Result/Twokeys_belongsto.pm0000644000175000017500000000120213556035664030450 0ustar ahartmaiahartmaipackage DBSchema::Result::Twokeys_belongsto; # Created by DBIx::Class::Schema::Loader v0.03000 @ 2006-10-02 08:24:09 use strict; use warnings; use base 'DBIx::Class'; __PACKAGE__->load_components("PK::Auto", "Core"); __PACKAGE__->table("twokeys_belongsto"); __PACKAGE__->add_columns( "key1" => { data_type => 'integer' }, "key2" => { data_type => 'integer' }, ); __PACKAGE__->set_primary_key("key1", "key2"); __PACKAGE__->add_relationship('like_belongs_to', 'DBSchema::Result::Dvd', { 'foreign.twokeysfk' => 'self.key1' }, ); __PACKAGE__->belongs_to('onekey', 'DBSchema::Result::Onekey', { 'foreign.id' => 'self.key1' }, ); 1;DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/TwoPkHasManyDB/0000775000175000017500000000000013556035664024247 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/TwoPkHasManyDB/Schema.pm0000644000175000017500000000025713556035664026007 0ustar ahartmaiahartmaipackage TwoPkHasManyDB::Schema; use base 'DBIx::Class::Schema'; __PACKAGE__->load_namespaces( default_resultset_class => '+DBIx::Class::ResultSet::RecursiveUpdate' ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/TwoPkHasManyDB/Schema/0000775000175000017500000000000013556035664025447 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/TwoPkHasManyDB/Schema/Result/0000775000175000017500000000000013556035664026725 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/TwoPkHasManyDB/Schema/Result/Item.pm0000644000175000017500000000124713556035664030163 0ustar ahartmaiahartmaipackage TwoPkHasManyDB::Schema::Result::Item; use strict; use warnings; use base 'DBIx::Class'; __PACKAGE__->load_components("Core"); __PACKAGE__->table("item"); __PACKAGE__->add_columns( "id", { data_type => "INTEGER", is_auto_increment => 1, is_nullable => 0 }, ); __PACKAGE__->set_primary_key("id"); __PACKAGE__->has_many( "relateditems", "TwoPkHasManyDB::Schema::Result::RelatedItem", { "foreign.item_id" => "self.id" }, { cascade_copy => 0, cascade_delete => 0 }, ); __PACKAGE__->has_many( "relateditems2", "TwoPkHasManyDB::Schema::Result::RelatedItem2", { "foreign.item_id" => "self.id" }, { cascade_copy => 0, cascade_delete => 0 }, ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/TwoPkHasManyDB/Schema/Result/RelatedItem.pm0000644000175000017500000000103613556035664031460 0ustar ahartmaiahartmaipackage TwoPkHasManyDB::Schema::Result::RelatedItem; use strict; use warnings; use base 'DBIx::Class'; __PACKAGE__->load_components("Core"); __PACKAGE__->table("relateditem"); __PACKAGE__->add_columns( "id", { data_type => "INTEGER", is_auto_increment => 1, is_nullable => 0 }, "item_id", { data_type => "integer", is_foreign_key => 1, is_nullable => 0, }, ); __PACKAGE__->belongs_to( 'item', 'TwoPkHasManyDB::Schema::Result::Item', { id => 'item_id'}, ); __PACKAGE__->set_primary_key("id", "item_id"); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/TwoPkHasManyDB/Schema/Result/RelatedItem2.pm0000644000175000017500000000104613556035664031543 0ustar ahartmaiahartmaipackage TwoPkHasManyDB::Schema::Result::RelatedItem2; use strict; use warnings; use base 'DBIx::Class'; __PACKAGE__->load_components("Core"); __PACKAGE__->table("relateditem2"); __PACKAGE__->add_columns( "idcol", { data_type => "INTEGER", is_auto_increment => 1, is_nullable => 0 }, "item_id", { data_type => "integer", 
is_foreign_key => 1, is_nullable => 0, }, ); __PACKAGE__->belongs_to( 'item', 'TwoPkHasManyDB::Schema::Result::Item', { id => 'item_id'}, ); __PACKAGE__->set_primary_key("idcol", "item_id"); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBSchemaMoose/0000775000175000017500000000000013556035664024125 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/DBSchemaMoose/ResultSet.pm0000644000175000017500000000031713556035664026414 0ustar ahartmaiahartmaipackage DBSchemaMoose::ResultSet; use namespace::autoclean; use Moose; use MooseX::NonMoose; extends qw/DBIx::Class::ResultSet::RecursiveUpdate DBIx::Class::ResultSet/; __PACKAGE__->meta->make_immutable; 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/AnotherTestDB/0000775000175000017500000000000013556035664024162 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/AnotherTestDB/OnePK/0000775000175000017500000000000013556035664025136 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/AnotherTestDB/OnePK/Schema.pm0000644000175000017500000000026513556035664026675 0ustar ahartmaiahartmaipackage AnotherTestDB::OnePK::Schema; use base 'DBIx::Class::Schema'; __PACKAGE__->load_namespaces( default_resultset_class => '+DBIx::Class::ResultSet::RecursiveUpdate' ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/AnotherTestDB/OnePK/Schema/0000775000175000017500000000000013556035664026336 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/AnotherTestDB/OnePK/Schema/Result/0000775000175000017500000000000013556035664027614 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/AnotherTestDB/OnePK/Schema/Result/Item.pm0000644000175000017500000000143413556035664031050 0ustar ahartmaiahartmaipackage AnotherTestDB::OnePK::Schema::Result::Item; use strict; use warnings; use base 'DBIx::Class'; __PACKAGE__->load_components("Core"); __PACKAGE__->table("item"); __PACKAGE__->add_columns( "idcol", { data_type => "INTEGER", is_auto_increment => 1, is_nullable => 0 }, ); __PACKAGE__->set_primary_key("idcol"); __PACKAGE__->has_many( "relateditems", "AnotherTestDB::OnePK::Schema::Result::RelatedItem", { "foreign.item_id" => "self.idcol" }, { cascade_copy => 0, cascade_delete => 0 }, ); __PACKAGE__->has_many( "true_relateditems", "AnotherTestDB::OnePK::Schema::Result::RelatedItem", { "foreign.item_id" => "self.idcol" }, {where => { 'conditionitems.condition' => 'true'}, 'join' => qq/conditionitems/, cascade_copy => 0, cascade_delete => 0 }, ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/AnotherTestDB/OnePK/Schema/Result/RelatedItem.pm0000644000175000017500000000135613556035664032354 0ustar ahartmaiahartmaipackage AnotherTestDB::OnePK::Schema::Result::RelatedItem; use strict; use warnings; use base 'DBIx::Class'; __PACKAGE__->load_components("Core"); __PACKAGE__->table("relateditem"); __PACKAGE__->add_columns( "idcol", { data_type => "INTEGER", is_auto_increment => 1, is_nullable => 0 }, "item_id", { data_type => "integer", is_foreign_key => 1, is_nullable => 0, }, ); __PACKAGE__->belongs_to( 'item', 'AnotherTestDB::OnePK::Schema::Result::Item', { idcol => 'item_id'}, ); __PACKAGE__->has_many( "conditionitems", "AnotherTestDB::OnePK::Schema::Result::ConditionItem", { "foreign.rel_item_id" => "self.idcol" }, { cascade_copy => 0, cascade_delete => 0 }, ); __PACKAGE__->set_primary_key("idcol"); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/AnotherTestDB/OnePK/Schema/Result/ConditionItem.pm0000644000175000017500000000117113556035664032715 0ustar 
ahartmaiahartmaipackage AnotherTestDB::OnePK::Schema::Result::ConditionItem; use strict; use warnings; use base 'DBIx::Class'; __PACKAGE__->load_components("Core"); __PACKAGE__->table("conditionitem"); __PACKAGE__->add_columns( "idcol", { data_type => "INTEGER", is_auto_increment => 1, is_nullable => 0 }, "condition", { data_type => "TEXT", is_nullable => 0 }, "rel_item_id", { data_type => "integer", is_foreign_key => 1, is_nullable => 0, }, ); __PACKAGE__->belongs_to( 'related_item', 'AnotherTestDB::OnePK::Schema::Result::RelatedItem', { idcol => 'rel_item_id'}, ); __PACKAGE__->set_primary_key("idcol"); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/AnotherTestDB/TwoPK/0000775000175000017500000000000013556035664025166 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/AnotherTestDB/TwoPK/Schema.pm0000644000175000017500000000026513556035664026725 0ustar ahartmaiahartmaipackage AnotherTestDB::TwoPK::Schema; use base 'DBIx::Class::Schema'; __PACKAGE__->load_namespaces( default_resultset_class => '+DBIx::Class::ResultSet::RecursiveUpdate' ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/AnotherTestDB/TwoPK/Schema/0000775000175000017500000000000013556035664026366 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/AnotherTestDB/TwoPK/Schema/Result/0000775000175000017500000000000013556035664027644 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/AnotherTestDB/TwoPK/Schema/Result/Item.pm0000644000175000017500000000143413556035664031100 0ustar ahartmaiahartmaipackage AnotherTestDB::TwoPK::Schema::Result::Item; use strict; use warnings; use base 'DBIx::Class'; __PACKAGE__->load_components("Core"); __PACKAGE__->table("item"); __PACKAGE__->add_columns( "idcol", { data_type => "INTEGER", is_auto_increment => 1, is_nullable => 0 }, ); __PACKAGE__->set_primary_key("idcol"); __PACKAGE__->has_many( "relateditems", "AnotherTestDB::TwoPK::Schema::Result::RelatedItem", { "foreign.item_id" => "self.idcol" }, { cascade_copy => 0, cascade_delete => 0 }, ); __PACKAGE__->has_many( "true_relateditems", "AnotherTestDB::TwoPK::Schema::Result::RelatedItem", { "foreign.item_id" => "self.idcol" }, {where => { 'conditionitems.condition' => 'true'}, 'join' => qq/conditionitems/, cascade_copy => 0, cascade_delete => 0 }, ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/AnotherTestDB/TwoPK/Schema/Result/RelatedItem.pm0000644000175000017500000000137113556035664032401 0ustar ahartmaiahartmaipackage AnotherTestDB::TwoPK::Schema::Result::RelatedItem; use strict; use warnings; use base 'DBIx::Class'; __PACKAGE__->load_components("Core"); __PACKAGE__->table("relateditem"); __PACKAGE__->add_columns( "idcol", { data_type => "INTEGER", is_auto_increment => 1, is_nullable => 0 }, "item_id", { data_type => "integer", is_foreign_key => 1, is_nullable => 0, }, ); __PACKAGE__->belongs_to( 'item', 'AnotherTestDB::TwoPK::Schema::Result::Item', { idcol => 'item_id'}, ); __PACKAGE__->has_many( "conditionitems", "AnotherTestDB::TwoPK::Schema::Result::ConditionItem", { "foreign.rel_item_id" => "self.idcol" }, { cascade_copy => 0, cascade_delete => 0 }, ); __PACKAGE__->set_primary_key("idcol", "item_id"); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/lib/AnotherTestDB/TwoPK/Schema/Result/ConditionItem.pm0000644000175000017500000000117113556035664032745 0ustar ahartmaiahartmaipackage AnotherTestDB::TwoPK::Schema::Result::ConditionItem; use strict; use warnings; use base 'DBIx::Class'; __PACKAGE__->load_components("Core"); __PACKAGE__->table("conditionitem"); 
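# mirrors the OnePK variant: a condition item belongs to a related item (which
# in this TwoPK schema has a two-column primary key) and stores the free-text
# condition matched by Item's 'true_relateditems' relationship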
__PACKAGE__->add_columns( "idcol", { data_type => "INTEGER", is_auto_increment => 1, is_nullable => 0 }, "condition", { data_type => "TEXT", is_nullable => 0 }, "rel_item_id", { data_type => "integer", is_foreign_key => 1, is_nullable => 0, }, ); __PACKAGE__->belongs_to( 'related_item', 'AnotherTestDB::TwoPK::Schema::Result::RelatedItem', { idcol => 'rel_item_id'}, ); __PACKAGE__->set_primary_key("idcol"); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/twopk_has_many.t0000644000175000017500000000247413556035664024163 0ustar ahartmaiahartmaiuse strict; use warnings; use lib 't/lib'; use Test::More; use Test::Exception; use_ok 'TwoPkHasManyDB::Schema'; my $schema = TwoPkHasManyDB::Schema->connect('dbi:SQLite:dbname=:memory:'); isa_ok $schema, 'DBIx::Class::Schema'; lives_ok( sub{ $schema->deploy(); $schema->populate('Item', [ [ qw/id/ ], [ 1 ], ]); $schema->populate('RelatedItem', [ [ qw/id item_id/ ], [ 1, 1 ], [ 2, 1 ], ]); $schema->populate('RelatedItem2', [ [ qw/idcol item_id/ ], [ 1, 1 ], [ 2, 1 ], ]); }, 'creating and populating test database' ); is($schema->resultset('Item')->find({ id =>1})->relateditems->count, 2); is($schema->resultset('Item')->find({ id =>1})->relateditems2->count, 2); # this one will fail for unpatched RecursiveUpdate lives_ok(sub{ $schema->resultset('Item')->recursive_update({ id => 1, relateditems => [{ id => 1, item_id => 1, }], }); }, "updating two_pk relation with colname id"); # this works fine, even with unpatched RecursiveUpdate lives_ok(sub{ $schema->resultset('Item')->recursive_update({ id => 1, relateditems2 => [{ idcol => 1, item_id => 1, }], }); }, "updating two_pk relation without colname id"); is($schema->resultset('Item')->find({ id =>1})->relateditems->count, 1); is($schema->resultset('Item')->find({ id =>1})->relateditems2->count, 1); done_testing; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/96multi_create.t0000644000175000017500000007577313556035664024010 0ustar ahartmaiahartmaiuse strict; use warnings; use Test::More; use Test::Exception; use lib qw(t/lib); use DBICTest; my $schema = DBICTest->init_schema(); diag '* simple create + parent (the stuff $rs belongs_to)'; eval { my $cd = $schema->resultset('CD')->recursive_update( { artist => { name => 'Fred Bloggs' }, title => 'Some CD', year => 1996 } ); isa_ok( $cd, 'DBICTest::CD', 'Created CD object' ); isa_ok( $cd->artist, 'DBICTest::Artist', 'Created related Artist' ); is( $cd->artist->name, 'Fred Bloggs', 'Artist created correctly' ); }; diag $@ if $@; diag '* same as above but the child and parent have no values, except for an explicit parent pk'; eval { my $bm_rs = $schema->resultset('Bookmark'); my $bookmark = $bm_rs->recursive_update( { link => { id => 66, }, } ); isa_ok( $bookmark, 'DBICTest::Bookmark', 'Created Bookrmark object' ); isa_ok( $bookmark->link, 'DBICTest::Link', 'Created related Link' ); is( $bm_rs->search( { 'link.title' => $bookmark->link->title }, { join => 'link' }, )->count, 1, 'Bookmark and link made it to the DB', ); }; diag $@ if $@; diag '* Create m2m while originating in the linker table'; eval { my $artist = $schema->resultset('Artist')->first; my $c2p = $schema->resultset('CD_to_Producer')->recursive_update( { cd => { artist => $artist, title => 'Bad investment', year => 2008, tracks => [ { pos => 1, title => 'Just buy' }, { pos => 2, title => 'Why did we do it' }, { pos => 3, title => 'Burn baby burn' }, ], }, producer => { name => 'Lehman Bros.', }, } ); isa_ok( $c2p, 'DBICTest::CD_to_Producer', 'Linker object created' ); my $prod = 
$schema->resultset('Producer')->find( { name => 'Lehman Bros.' } ); isa_ok( $prod, 'DBICTest::Producer', 'Producer row found' ); is( $prod->cds->count, 1, 'Producer has one production' ); my $cd = $prod->cds->first; is( $cd->title, 'Bad investment', 'CD created correctly' ); is( $cd->tracks->count, 3, 'CD has 3 tracks' ); }; diag $@ if $@; diag(<<'DG'); * Create over > 1 levels of might_have with multiple has_many and multiple m2m but starting at a has_many level CD -> has_many -> Tracks -> might have -> Single -> has_many -> Tracks \ \-> has_many \ --> CD2Producer /-> has_many / / Producer DG eval { my $artist = $schema->resultset('Artist')->first; my $cd = $schema->resultset('CD')->recursive_update( { artist => $artist, title => 'Music to code by at night', year => 2008, tracks => [ { pos => 1, # some day me might test this with Ordered title => 'Off by one again', }, { pos => 2, title => 'The dereferencer', cd_single => { artist => $artist, year => 2008, title => 'Was that a null (Single)', tracks => [ { title => 'The dereferencer', pos => 1 }, { title => 'The dereferencer II', pos => 2 }, ], cd_to_producer => [ { producer => { name => 'K&R', } }, { producer => { name => 'Don Knuth', } }, ] }, }, ], } ); isa_ok( $cd, 'DBICTest::CD', 'Main CD object created' ); is( $cd->title, 'Music to code by at night', 'Correct CD title' ); is( $cd->tracks->count, 2, 'Two tracks on main CD' ); my ( $t1, $t2 ) = $cd->tracks->all; is( $t1->title, 'Off by one again', 'Correct 1st track name' ); is( $t1->cd_single, undef, 'No single for 1st track' ); is( $t2->title, 'The dereferencer', 'Correct 2nd track name' ); isa_ok( $t2->cd_single, 'DBICTest::CD', 'Created a single for 2nd track' ); my $single = $t2->cd_single; is( $single->tracks->count, 2, 'Two tracks on single CD' ); is( $single->tracks->find( { position => 1 } )->title, 'The dereferencer', 'Correct 1st track title' ); is( $single->tracks->find( { position => 2 } )->title, 'The dereferencer II', 'Correct 2nd track title' ); is( $single->cd_to_producer->count, 2, 'Two producers created for the single cd' ); is_deeply( [ sort map { $_->producer->name } ( $single->cd_to_producer->all ) ], [ 'Don Knuth', 'K&R' ], 'Producers named correctly', ); }; diag $@ if $@; diag(<<'DG'); * Same as above but starting at the might_have directly Track -> might have -> Single -> has_many -> Tracks \ \-> has_many \ --> CD2Producer /-> has_many / / Producer DG eval { my $cd = $schema->resultset('CD')->first; my $track = $schema->resultset('Track')->recursive_update( { cd => $cd, pos => 77, # some day me might test this with Ordered title => 'Multicreate rocks', cd_single => { artist => $cd->artist, year => 2008, title => 'Disemboweling MultiCreate', tracks => [ { title => 'Why does mst write this way', pos => 1 }, { title => 'Chainsaw celebration', pos => 2 }, { title => 'Purl cleans up', pos => 3 }, ], cd_to_producer => [ { producer => { name => 'mst', } }, { producer => { name => 'castaway', } }, { producer => { name => 'theorbtwo', } }, ] }, } ); isa_ok( $track, 'DBICTest::Track', 'Main Track object created' ); is( $track->title, 'Multicreate rocks', 'Correct Track title' ); my $single = $track->cd_single; isa_ok( $single, 'DBICTest::CD', 'Created a single with the track' ); is( $single->tracks->count, 3, '3 tracks on single CD' ); is( $single->tracks->find( { position => 1 } )->title, 'Why does mst write this way', 'Correct 1st track title' ); is( $single->tracks->find( { position => 2 } )->title, 'Chainsaw celebration', 'Correct 2nd track title' ); is( 
$single->tracks->find( { position => 3 } )->title, 'Purl cleans up', 'Correct 3rd track title' ); is( $single->cd_to_producer->count, 3, '3 producers created for the single cd' ); is_deeply( [ sort map { $_->producer->name } ( $single->cd_to_producer->all ) ], [ 'castaway', 'mst', 'theorbtwo' ], 'Producers named correctly', ); }; diag $@ if $@; diag '* Test might_have again but with a PK == FK in the middle (obviously not specified)'; eval { my $artist = $schema->resultset('Artist')->first; my $cd = $schema->resultset('CD')->recursive_update( { artist => $artist, title => 'Music to code by at twilight', year => 2008, artwork => { images => [ { name => 'recursive descent' }, { name => 'tail packing' }, ], }, } ); isa_ok( $cd, 'DBICTest::CD', 'Main CD object created' ); is( $cd->title, 'Music to code by at twilight', 'Correct CD title' ); isa_ok( $cd->artwork, 'DBICTest::Artwork', 'Artwork created' ); # this test might look weird, but it failed at one point, keep it there my $art_obj = $cd->artwork; ok( $art_obj->has_column_loaded('cd_id'), 'PK/FK present on artwork object' ); is( $art_obj->images->count, 2, 'Correct artwork image count via the new object' ); is_deeply( [ sort $art_obj->images->get_column('name')->all ], [ 'recursive descent', 'tail packing' ], 'Images named correctly in objects', ); my $artwork = $schema->resultset('Artwork')->search( { 'cd.title' => 'Music to code by at twilight' }, { join => 'cd' }, )->single; is( $artwork->images->count, 2, 'Correct artwork image count via a new search' ); is_deeply( [ sort $artwork->images->get_column('name')->all ], [ 'recursive descent', 'tail packing' ], 'Images named correctly after search', ); }; diag $@ if $@; diag '* Test might_have again but with just a PK and FK (neither specified) in the mid-table'; eval { my $cd = $schema->resultset('CD')->first; my $track = $schema->resultset('Track')->recursive_update( { cd => $cd, pos => 66, title => 'Black', lyrics => { lyric_versions => [ { text => 'The color black' }, { text => 'The colour black' }, ], }, } ); isa_ok( $track, 'DBICTest::Track', 'Main track object created' ); is( $track->title, 'Black', 'Correct track title' ); isa_ok( $track->lyrics, 'DBICTest::Lyrics', 'Lyrics created' ); # this test might look weird, but it was failing at one point, keep it there my $lyric_obj = $track->lyrics; ok( $lyric_obj->has_column_loaded('lyric_id'), 'PK present on lyric object' ); ok( $lyric_obj->has_column_loaded('track_id'), 'FK present on lyric object' ); is( $lyric_obj->lyric_versions->count, 2, 'Correct lyric versions count via the new object' ); is_deeply( [ sort $lyric_obj->lyric_versions->get_column('text')->all ], [ 'The color black', 'The colour black' ], 'Lyrics text in objects matches', ); my $lyric = $schema->resultset('Lyrics') ->search( { 'track.title' => 'Black' }, { join => 'track' }, ) ->single; is( $lyric->lyric_versions->count, 2, 'Correct lyric versions count via a new search' ); is_deeply( [ sort $lyric->lyric_versions->get_column('text')->all ], [ 'The color black', 'The colour black' ], 'Lyrics text via search matches', ); }; diag $@ if $@; diag(<<'DG'); * Test a multilevel might-have with a PK == FK in the might_have/has_many table CD -> might have -> Artwork \ \-> has_many \ --> Artwork_to_Artist /-> has_many / / Artist DG eval { my $someartist = $schema->resultset('Artist')->first; my $cd = $schema->resultset('CD')->recursive_update( { artist => $someartist, title => 'Music to code by until the cows come home', year => 2008, artwork => { artwork_to_artist => [ { 
artist => { name => 'cowboy joe' } }, { artist => { name => 'billy the kid' } }, ], }, } ); isa_ok( $cd, 'DBICTest::CD', 'Main CD object created' ); is( $cd->title, 'Music to code by until the cows come home', 'Correct CD title' ); my $art_obj = $cd->artwork; ok( $art_obj->has_column_loaded('cd_id'), 'PK/FK present on artwork object' ); is( $art_obj->artists->count, 2, 'Correct artwork creator count via the new object' ); is_deeply( [ sort $art_obj->artists->get_column('name')->all ], [ 'billy the kid', 'cowboy joe' ], 'Artists named correctly when queried via object', ); my $artwork = $schema->resultset('Artwork')->search( { 'cd.title' => 'Music to code by until the cows come home' }, { join => 'cd' }, )->single; is( $artwork->artists->count, 2, 'Correct artwork creator count via a new search' ); is_deeply( [ sort $artwork->artists->get_column('name')->all ], [ 'billy the kid', 'cowboy joe' ], 'Artists named correctly queried via a new search', ); }; diag $@ if $@; diag '* Nested find_or_create'; eval { my $newartist2 = $schema->resultset('Artist')->recursive_update( { name => 'Fred 3', cds => [ { title => 'Noah Act', year => 2007, }, ], } ); is( $newartist2->name, 'Fred 3', 'Created new artist with cds via find_or_create' ); }; diag $@ if $@; diag '* Multiple same level has_many create'; eval { my $artist2 = $schema->resultset('Artist')->recursive_update( { name => 'Fred 4', cds => [ { title => 'Music to code by', year => 2007, }, ], cds_unordered => [ { title => 'Music to code by 1', # original title => 'Music to code by', year => 2007, }, ] } ); is( $artist2->in_storage, 1, 'artist with duplicate rels inserted okay' ); }; diag $@ if $@; diag '* First create_related pass'; eval { my $artist = $schema->resultset('Artist')->first; my $cd_result = $schema->resultset('CD')->recursive_update( { artist => $artist->artistid, title => 'TestOneCD1', year => 2007, tracks => [ { pos => 111, title => 'TrackOne', }, { pos => 112, title => 'TrackTwo', } ], } ); ok( $cd_result && ref $cd_result eq 'DBICTest::CD', "Got Good CD Class" ); ok( $cd_result->title eq "TestOneCD1", "Got Expected Title" ); my $tracks = $cd_result->tracks; ok( $tracks->isa("DBIx::Class::ResultSet"), "Got Expected Tracks ResultSet" ); foreach my $track ( $tracks->all ) { ok( $track && ref $track eq 'DBICTest::Track', 'Got Expected Track Class' ); } }; diag $@ if $@; diag '* second create_related with same arguments'; eval { my $artist = $schema->resultset('Artist')->first; my $cd_result = $schema->resultset('CD')->recursive_update( { artist => $artist->artistid, title => 'TestOneCD2', year => 2007, tracks => [ { pos => 111, title => 'TrackOne', }, { pos => 112, title => 'TrackTwo', } ], liner_notes => { notes => 'I can haz liner notes?' 
}, } ); ok( $cd_result && ref $cd_result eq 'DBICTest::CD', "Got Good CD Class" ); ok( $cd_result->title eq "TestOneCD2", "Got Expected Title" ); ok( $cd_result->notes eq 'I can haz liner notes?', 'Liner notes' ); my $tracks = $cd_result->tracks; ok( $tracks->isa("DBIx::Class::ResultSet"), "Got Expected Tracks ResultSet" ); foreach my $track ( $tracks->all ) { ok( $track && ref $track eq 'DBICTest::Track', 'Got Expected Track Class' ); } }; diag $@ if $@; diag '* create of parents of a record linker table'; eval { my $cdp = $schema->resultset('CD_to_Producer')->recursive_update( { cd => { artist => 1, title => 'foo', year => 2000 }, producer => { name => 'jorge' } } ); ok( $cdp, 'join table record created ok' ); }; diag $@ if $@; diag '* Create foreign key col obj including PK (See test 20 in 66relationships.t)'; eval { my $new_cd_hashref = { cdid => 27, title => 'Boogie Woogie', year => '2007', artist => { artistid => 17, name => 'king luke' } }; my $cd = $schema->resultset("CD")->find(1); is( $cd->artist->id, 1, 'rel okay' ); my $new_cd = $schema->resultset("CD")->recursive_update($new_cd_hashref); is( $new_cd->artist->id, 17, 'new id retained okay' ); }; diag $@ if $@; eval { $schema->resultset("CD")->recursive_update( { cdid => 28, title => 'Boogie Wiggle', year => '2007', artist => { artistid => 18, name => 'larry' } } ); }; is( $@, '', 'new cd created without clash on related artist' ); diag '* Test multi create over many_to_many'; eval { $schema->resultset('CD')->recursive_update( { artist => { name => 'larry', # should already exist }, title => 'Warble Marble', year => '2009', cd_to_producer => [ { producer => { name => 'Cowboy Neal' } }, ], } ); my $m2m_cd = $schema->resultset('CD')->search( { title => 'Warble Marble' } ); is( $m2m_cd->count, 1, 'One CD row created via M2M create' ); is( $m2m_cd->first->producers->count, 1, 'CD row created with one producer' ); is( $m2m_cd->first->producers->first->name, 'Cowboy Neal', 'Correct producer row created' ); }; diag '* And the insane multicreate'; # (should work, despite the fact that no one will probably use it this way) # first count how many rows do we initially have my $counts; $counts->{$_} = $schema->resultset($_)->count for qw/Artist CD Genre Producer Tag/; # do the crazy create eval { my $greatest_collections = $schema->resultset('Genre') ->create( { name => '"Greatest" collections' } ); my $greatest_collections2 = $schema->resultset('Genre') ->create( { name => '"Greatest" collections2' } ); $schema->resultset('CD')->recursive_update( { artist => { name => 'james', }, title => 'Greatest hits 1', year => '2012', genre => $greatest_collections, tags => [ { tag => 'A' }, { tag => 'B' }, ], cd_to_producer => [ { producer => { name => 'bob', producer_to_cd => [ { cd => { artist => { name => 'lars', cds => [ { title => 'Greatest hits 2', year => 2012, genre => $greatest_collections, tags => [ { tag => 'A' }, { tag => 'B' }, ], # This cd is created via artist so it doesn't know about producers cd_to_producer => [ # if we specify 'bob' here things bomb # as the producer attached to Greatest Hits 1 is # already created, but not yet inserted. # Maybe this can be fixed, but things are hairy # enough already. 
# #{ producer => { name => 'bob' } }, { producer => { name => 'paul' } }, { producer => { name => 'flemming', producer_to_cd => [ { cd => { artist => { name => 'kirk', cds => [ { title => 'Greatest hits 3', year => 2012, genre => $greatest_collections, tags => [ { tag => 'A' } , { tag => 'B' } , ] , } , { title => 'Greatest hits 4', year => 2012, genre => $greatest_collections2, tags => [ { tag => 'A' } , { tag => 'B' } , ] , } , ] , }, title => 'Greatest hits 5', year => 2013, genre => $greatest_collections2, } }, ], } }, ], }, ], }, title => 'Greatest hits 6', year => 2012, genre => $greatest_collections, tags => [ { tag => 'A' }, { tag => 'B' }, ], }, }, { cd => { artist => { name => 'lars', # in recursive_update this creates a new artist - since no id provided # in original create - # should already exist # even though the artist 'name' is not uniquely constrained # find_or_create will arguably DWIM }, title => 'Greatest hits 7', year => 2013, }, }, ], }, }, ], } ); is( $schema->resultset('Artist')->count, $counts->{Artist} + 4, '4 new artists created' ); is( $schema->resultset('Genre')->count, $counts->{Genre} + 2, '2 additional genres created' ); is( $schema->resultset('Producer')->count, $counts->{Producer} + 3, '3 new producer' ); is( $schema->resultset('CD')->count, $counts->{CD} + 7, '7 new CDs' ); is( $schema->resultset('Tag')->count, $counts->{Tag} + 10, '10 new Tags' ); my $cd_rs = $schema->resultset('CD') ->search( { title => { -like => 'Greatest hits %' } }, { order_by => 'title' } ); is( $cd_rs->count, 7, '7 greatest hits created' ); my $cds_2012 = $cd_rs->search( { year => 2012 } ); is( $cds_2012->count, 5, '5 CDs created in 2012' ); is( $cds_2012->search( { 'tags.tag' => { -in => [qw/A B/] } }, { join => 'tags', group_by => 'me.cdid' } ), 5, 'All 10 tags were pairwise distributed between 5 year-2012 CDs' ); my $paul_prod = $cd_rs->search( { 'producer.name' => 'paul' }, { join => { cd_to_producer => 'producer' } } ); is( $paul_prod->count, 1, 'Paul had 1 production' ); my $pauls_cd = $paul_prod->single; is( $pauls_cd->cd_to_producer->count, 2, 'Paul had one co-producer' ); is( $pauls_cd->search_related( 'cd_to_producer', { 'producer.name' => 'flemming' }, { join => 'producer' } )->count, 1, 'The second producer is flemming', ); my $kirk_cds = $cd_rs->search( { 'artist.name' => 'kirk' }, { join => 'artist' } ); is( $kirk_cds, 3, 'Kirk had 3 CDs' ); is( $kirk_cds->search( { 'cd_to_producer.cd' => { '!=', undef } }, { join => 'cd_to_producer' }, ), 1, 'Kirk had a producer only on one cd', ); my $lars_cds = $cd_rs->search( { 'artist.name' => 'lars' }, { join => 'artist' } ); is( $lars_cds->count, 3, 'Lars had 3 CDs' ); is( $lars_cds->search( { 'cd_to_producer.cd' => undef }, { join => 'cd_to_producer' }, ), 0, 'Lars always had a producer', ); is( $lars_cds->search_related( 'cd_to_producer', { 'producer.name' => 'flemming' }, { join => 'producer' } )->count, 1, 'Lars produced 1 CD with flemming', ); is( $lars_cds->search_related( 'cd_to_producer', { 'producer.name' => 'bob' }, { join => 'producer' } )->count, 2, 'Lars produced 2 CDs with bob', ); my $bob_prod = $cd_rs->search( { 'producer.name' => 'bob' }, { join => { cd_to_producer => 'producer' } } ); is( $bob_prod->count, 3, 'Bob produced a total of 3 CDs' ); is( $bob_prod->search( { 'artist.name' => 'james' }, { join => 'artist' } )->count, 1, "Bob produced james' only CD", ); }; diag $@ if $@; ## Test for the might_have is allowed empty bug (should check and see if this ## needs patching upstream to DBIC TODO: { todo_skip 
"DBIx::Class 0.082841 clears cdid primary key of CD after" . "setting the first belongs_to relationship 'artwork'"; use DBIx::Class::ResultSet::RecursiveUpdate; my $cd_rs = $schema->resultset('CD'); my $cd = $cd_rs->first; # add a track to the cd my $track = $schema->resultset('Track')->next; $cd->single_track($track); $cd->update; $cd->discard_changes; ok( $cd->single_track_id, 'cd has a single_track_id' ); ok( $cd->single_track, 'cd has a single_track' ); DBIx::Class::ResultSet::RecursiveUpdate::Functions::recursive_update( resultset => $cd_rs, updates => { artwork => undef, liner_notes => undef, tracks => [ { title => 'hello', pos => '100' } ], single_track => undef, }, object => $cd, ); $cd->discard_changes; is( $cd->single_track, undef, 'Might have deleted' ); }; done_testing(); DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/author-synopsis.t0000644000175000017500000000026213556035664024320 0ustar ahartmaiahartmai#!perl BEGIN { unless ($ENV{AUTHOR_TESTING}) { print qq{1..0 # SKIP these tests are for testing by the author\n}; exit } } use Test::Synopsis; all_synopsis_ok(); DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/author-pod-syntax.t0000644000175000017500000000045413556035664024542 0ustar ahartmaiahartmai#!perl BEGIN { unless ($ENV{AUTHOR_TESTING}) { print qq{1..0 # SKIP these tests are for testing by the author\n}; exit } } # This file was automatically generated by Dist::Zilla::Plugin::PodSyntaxTests. use strict; use warnings; use Test::More; use Test::Pod 1.41; all_pod_files_ok(); DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/author-portability.t0000644000175000017500000000047113556035664024775 0ustar ahartmaiahartmai BEGIN { unless ($ENV{AUTHOR_TESTING}) { print qq{1..0 # SKIP these tests are for testing by the author\n}; exit } } use strict; use warnings; use Test::More; eval 'use Test::Portability::Files'; plan skip_all => 'Test::Portability::Files required for testing portability' if $@; run_tests(); DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/author-pod-coverage.t0000644000175000017500000000053613556035664025010 0ustar ahartmaiahartmai#!perl BEGIN { unless ($ENV{AUTHOR_TESTING}) { print qq{1..0 # SKIP these tests are for testing by the author\n}; exit } } # This file was automatically generated by Dist::Zilla::Plugin::PodCoverageTests. 
use Test::Pod::Coverage 1.08; use Pod::Coverage::TrustPod; all_pod_coverage_ok({ coverage_class => 'Pod::Coverage::TrustPod' }); DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/release-unused-vars.t0000644000175000017500000000057113556035664025026 0ustar ahartmaiahartmai#!perl BEGIN { unless ($ENV{RELEASE_TESTING}) { print qq{1..0 # SKIP these tests are for release candidate testing\n}; exit } } use Test::More 0.96 tests => 1; eval { require Test::Vars }; SKIP: { skip 1 => 'Test::Vars required for testing for unused vars' if $@; Test::Vars->import; subtest 'unused vars' => sub { all_vars_ok(); }; }; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/release-has-version.t0000644000175000017500000000044413556035664025007 0ustar ahartmaiahartmai#!perl BEGIN { unless ($ENV{RELEASE_TESTING}) { print qq{1..0 # SKIP these tests are for release candidate testing\n}; exit } } use Test::More; eval "use Test::HasVersion"; plan skip_all => "Test::HasVersion required for testing version numbers" if $@; all_pm_version_ok(); DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/conditional_has_many.t0000644000175000017500000000416013556035664025314 0ustar ahartmaiahartmaiuse strict; use warnings; use lib 't/lib'; use Test::More; use Test::Exception; use_ok 'AnotherTestDB::OnePK::Schema'; my $schema = AnotherTestDB::OnePK::Schema->connect('dbi:SQLite:dbname=:memory:'); isa_ok $schema, 'DBIx::Class::Schema'; lives_ok( sub{ #$schema->deploy({add_drop_table => 1}); $schema->deploy(); $schema->populate('Item', [ [ qw/idcol/ ], [ 1 ], ]); $schema->populate('RelatedItem', [ [ qw/idcol item_id/ ], [ 1, 1 ], [ 2, 1 ], ]); $schema->populate('ConditionItem', [ [ qw/idcol rel_item_id condition/ ], [ 1, 1, 'false' ], [ 2, 1, 'true' ], [ 3, 2, 'true' ], [ 4, 2, 'false' ], ]); }, 'creating and populating test database' ); is($schema->resultset('Item')->find(1)->relateditems->count, 2); is($schema->resultset('Item')->find(1)->true_relateditems->count, 2); lives_ok(sub{ $schema->resultset('Item')->recursive_update({ idcol => 1, true_relateditems => [{ idcol => 1}], }); }); is($schema->resultset('Item')->find(1)->relateditems->count, 1); is($schema->resultset('Item')->find(1)->true_relateditems->count, 1); use_ok 'AnotherTestDB::TwoPK::Schema'; $schema = AnotherTestDB::TwoPK::Schema->connect('dbi:SQLite:dbname=:memory:'); isa_ok $schema, 'DBIx::Class::Schema'; lives_ok( sub{ #$schema->deploy({add_drop_table => 1}); $schema->deploy(); $schema->populate('Item', [ [ qw/idcol/ ], [ 1 ], ]); $schema->populate('RelatedItem', [ [ qw/idcol item_id/ ], [ 1, 1 ], [ 2, 1 ], ]); $schema->populate('ConditionItem', [ [ qw/idcol rel_item_id condition/ ], [ 1, 1, 'false' ], [ 2, 1, 'true' ], [ 3, 2, 'true' ], [ 4, 2, 'false' ], ]); }, 'creating and populating test database' ); is($schema->resultset('Item')->find({idcol => 1})->relateditems->count, 2); is($schema->resultset('Item')->find({idcol => 1})->true_relateditems->count, 2); lives_ok(sub{ $schema->resultset('Item')->recursive_update({ idcol => 1, true_relateditems => [{ idcol => 1, item_id => 1, }], }); }); is($schema->resultset('Item')->find({idcol => 1})->relateditems->count, 1); is($schema->resultset('Item')->find({idcol => 1})->true_relateditems->count, 1); done_testing; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/release-dist-manifest.t0000644000175000017500000000043713556035664025322 0ustar ahartmaiahartmai#!perl BEGIN { unless ($ENV{RELEASE_TESTING}) { print qq{1..0 # SKIP these tests are for release candidate testing\n}; exit } } use Test::More; eval "use Test::DistManifest"; plan skip_all => 
"Test::DistManifest required for testing the manifest" if $@; manifest_ok(); DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/belongs_to_including_pks.t0000644000175000017500000000332713556035664026202 0ustar ahartmaiahartmaiuse strict; use warnings; use Test::More; use Test::Exception; use lib qw(t/lib); use DBICTest; my $schema = DBICTest->init_schema(); diag 'Update foreign key with an updated primary key (similar to "Create foreign key col obj including PK" in 96multi_create.t)'; eval { my $new_cd_hashref = { cdid => 30, title => 'Boogie Woogie', year => '2007', artist => { artistid => 1 } }; my $cd = $schema->resultset("CD")->find(1); is( $cd->artist->id, 1, 'rel okay' ); my $new_cd = $schema->resultset("CD")->recursive_update($new_cd_hashref); is( $new_cd->artist->id, 1, 'new id retained okay' ); }; eval { my $updated_cd = $schema->resultset("CD")->recursive_update( { cdid => 30, title => 'Boogie Wiggle', year => '2007', artist => { artistid => 2 } } ); is( $updated_cd->artist->id, 2, 'related artist changed correctly' ); }; is( $@, '', 'new cd created without clash on related artist' ); diag 'The same as the last test, but on a relationship with accessor "single".'; eval { my $new_lyrics_hashref = { lyric_id => 1, track => { trackid => 4 } }; my $new_lyric = $schema->resultset("Lyrics")->recursive_update($new_lyrics_hashref); is( $new_lyric->track->trackid, 4, 'new id retained okay' ); }; eval { my $updated_lyric = $schema->resultset("Lyrics")->recursive_update( { lyric_id => 1, track => { trackid => 5 } } ); is( $updated_lyric->track->trackid, 5, 'related artist changed correctly' ); }; is( $@, '', 'new cd created without clash on related artist' ); done_testing; # vim: set ft=perl ts=4 expandtab: DBIx-Class-ResultSet-RecursiveUpdate-0.40/t/update_introspectable_m2m.t0000644000175000017500000002563413556035664026276 0ustar ahartmaiahartmai# Note: # # I am using DebugObject in t/lib to catch the DBIC debug output # and regexes to check the messages in order to find out what RU # really did. # # I think that this is a bad Idea. If the queries produced by # DBIC change in the future, these tests might fail even though # DBIC and RU still behave the same. # # I currently have no better idea how to find out weather RU # called set_$rel for M2Ms or not. # (It shouldn't if IntrospectableM2M is in use) # # I prefered this solution over monkeypatching DBIC, which was my # second idea. Any hints are highly welcome! # # - lukast use strict; use warnings; use Test::More; use DBIx::Class::ResultSet::RecursiveUpdate; use lib 't/lib'; use DBSchema; use DebugObject; my $schema = DBSchema->get_test_schema(); my $storage = $schema->storage; isa_ok $schema, "DBIx::Class::Schema"; isa_ok $storage, "DBIx::Class::Storage"; my $dbic_trace = DebugObject->new; $storage->debug(1); $storage->debugcb(sub { $dbic_trace->print($_[1]) }); my $dvd_rs = $schema->resultset('Dvd'); my $tag_rs = $schema->resultset('Tag'); ok $dvd_rs->result_class->can("_m2m_metadata"), "dvd-rs has m2m metadata"; ok ! $tag_rs->result_class->can("_m2m_metadata"), "tag-rs has no m2m metadata"; ############################################## # testing m2m updates with IntrospectableM2M # ############################################## my $dvd_item = $dvd_rs->first; # # adding one # my $tag_ids = [$dvd_item->tags_rs->get_column("id")->all]; push @$tag_ids, 1; my %updates = ( dvd_id => $dvd_item->id, tags => $tag_ids, ); $dbic_trace->clear; $dvd_rs->recursive_update(\%updates); ok ! 
$dbic_trace->count_messages('^DELETE FROM dvdtag WHERE \( dvd = \? \)'), "add one: update did not remove all tags'"; is $dbic_trace->count_messages("^DELETE FROM dvdtag "), 0, "add one: update executed no delete"; is $dbic_trace->count_messages("^INSERT INTO dvdtag "), 1, "add one: update executed one insert"; is $dvd_item->tags_rs->count, 3, "add one: DVD item has 3 tags"; # # removing one # shift @$tag_ids; %updates = ( dvd_id => $dvd_item->id, tags => $tag_ids, ); $dbic_trace->clear; $dvd_rs->recursive_update(\%updates); ok ! $dbic_trace->count_messages('^DELETE FROM dvdtag WHERE \( dvd = \? \)'), "remove one: update did not remove all tags'"; is $dbic_trace->count_messages("^DELETE FROM dvdtag "), 1, "remove one: update executed one delete"; is $dbic_trace->count_messages("^INSERT INTO dvdtag "), 0, "remove one: update executed no insert"; is $dvd_item->tags_rs->count, 2, "remove one: DVD item has 2 tags"; # # adding recursive # #push @$tag_ids, ( 4, 5, 6 ); %updates = ( dvd_id => $dvd_item->id, tags => [ (map { { name => $_->name, id => $_->id } } $dvd_item->tags->all) , { name => "winnie" }, { name => "fanny" }, { name => "sammy" }, ], ); $dbic_trace->clear; $dvd_rs->recursive_update(\%updates); ok ! $dbic_trace->count_messages('^DELETE FROM dvdtag WHERE \( dvd = \? \)'), "add several: update did not remove all tags'"; is $dbic_trace->count_messages("^DELETE FROM dvdtag "), 0, "add several: update executed no delete"; is $dbic_trace->count_messages("^INSERT INTO dvdtag "), 3, "add several: update executed three inserts in dvdtag"; is $dbic_trace->count_messages("^INSERT INTO tag "), 3, "add several: update executed three inserts in tag"; is $dvd_item->tags_rs->count, 5, "add several: DVD item has 5 tags"; # # updating recursive # #push @$tag_ids, ( 4, 5, 6 ); %updates = ( dvd_id => $dvd_item->id, tags => [ (map { { name => $_->name."_Changed", id => $_->id } } $dvd_item->tags->all) , ], ); $dbic_trace->clear; $dvd_rs->recursive_update(\%updates); ok ! $dbic_trace->count_messages('^DELETE FROM dvdtag WHERE \( dvd = \? \)'), "add several: update did not remove all tags'"; is $dbic_trace->count_messages("^DELETE FROM dvdtag "), 0, "add several: update executed no delete"; is $dbic_trace->count_messages("^INSERT INTO dvdtag "), 0, "add several: update executed no inserts in dvdtag"; is $dbic_trace->count_messages("^UPDATE tag "), 5, "add several: update executed five updates in tag"; is $dvd_item->tags_rs->count, 5, "add several: DVD item has 5 tags"; # # updating and removing # %updates = ( dvd_id => $dvd_item->id, tags => [ (map { { name => $_->name."More", id => $_->id } } $dvd_item->tags->all) , ], ); $updates{tags} = [splice @{$updates{tags}}, 2, 3]; $dbic_trace->clear; $dvd_rs->recursive_update(\%updates); ok ! $dbic_trace->count_messages('^DELETE FROM dvdtag WHERE \( dvd = \? \)'), "add several: update did not remove all tags'"; is $dbic_trace->count_messages("^DELETE FROM dvdtag "), 1, "add several: update executed one delete"; is $dbic_trace->count_messages("^INSERT INTO dvdtag "), 0, "add several: update executed no inserts in dvdtag"; is $dbic_trace->count_messages("^UPDATE tag "), 3, "add several: update executed three updates in tag"; is $dvd_item->tags_rs->count, 3, "add several: DVD item has 3 tags"; # # updating and adding # %updates = ( dvd_id => $dvd_item->id, tags => [ (map { { name => $_->name."More", id => $_->id } } $dvd_item->tags->all) , { name => "rob" }, { name => "bot" }, ], ); $dbic_trace->clear; $dvd_rs->recursive_update(\%updates); ok ! 
$dbic_trace->count_messages('^DELETE FROM dvdtag WHERE \( dvd = \? \)'), "add several: update did not remove all tags'"; is $dbic_trace->count_messages("^DELETE FROM dvdtag "), 0, "add several: update executed no delete"; is $dbic_trace->count_messages("^INSERT INTO dvdtag "), 2, "add several: update executed two inserts in dvdtag"; is $dbic_trace->count_messages("^UPDATE tag "), 3, "add several: update executed three updates in tag"; is $dvd_item->tags_rs->count, 5, "add several: DVD item has 5 tags"; # # removing several # $tag_ids = [4,5]; %updates = ( dvd_id => $dvd_item->id, tags => $tag_ids, ); $dbic_trace->clear; $dvd_rs->recursive_update(\%updates); ok ! $dbic_trace->count_messages('^DELETE FROM dvdtag WHERE \( dvd = \? \)'), "remove several: update did not remove all tags'"; is $dbic_trace->count_messages("^DELETE FROM dvdtag "), 1, "remove several: update executed one delete"; is $dbic_trace->count_messages("^INSERT INTO dvdtag "), 0, "remove several: update executed no insert"; is $dvd_item->tags_rs->count, 2, "remove several: DVD item has 2 tags"; # # empty arrayref # $tag_ids = []; %updates = ( dvd_id => $dvd_item->id, tags => $tag_ids, ); $dbic_trace->clear; $dvd_rs->recursive_update(\%updates); ok $dbic_trace->count_messages('^DELETE FROM dvdtag WHERE \( dvd = \? \)'), "remove all: update did remove all tags'"; is $dbic_trace->count_messages("^INSERT INTO dvdtag "), 0, "remove all: update executed no insert"; is $dvd_item->tags_rs->count, 0, "remove all: DVD item has no tags"; # # old set_$rel behaviour # $tag_ids = [2,4]; %updates = ( dvd_id => $dvd_item->id, tags => $tag_ids, ); $dbic_trace->clear; $dvd_rs->recursive_update(\%updates, {m2m_force_set_rel => 1}); ok $dbic_trace->count_messages('^DELETE FROM dvdtag WHERE \( dvd = \? \)'), "remove several: update did remove all tags'"; is $dbic_trace->count_messages("^INSERT INTO dvdtag "), 2, "remove several: update executed 2 insert"; is $dvd_item->tags_rs->count, 2, "remove several: DVD item has 2 tags"; # doint this 2 times to test identical behaviour $tag_ids = [2,4]; %updates = ( dvd_id => $dvd_item->id, tags => $tag_ids, ); $dbic_trace->clear; $dvd_rs->recursive_update(\%updates, {m2m_force_set_rel => 1}); ok $dbic_trace->count_messages('^DELETE FROM dvdtag WHERE \( dvd = \? \)'), "remove several: update did remove all tags'"; is $dbic_trace->count_messages("^INSERT INTO dvdtag "), 2, "remove several: update executed 2 insert"; is $dvd_item->tags_rs->count, 2, "remove several: DVD item has 2 tags"; ################################################# # testing m2m updates without IntrospectableM2M # ################################################# my $tag_item = $tag_rs->first; # # adding one # my $dvd_ids = [$tag_item->dvds_rs->get_column("dvd_id")->all]; push @$dvd_ids, 1; %updates = ( id => $tag_item->id, dvds => $dvd_ids, ); $dbic_trace->clear; $tag_rs->recursive_update(\%updates); ok $dbic_trace->count_messages('^DELETE FROM dvdtag WHERE \( tag = \? \)'), "add one: update did remove all dvds'"; is $dbic_trace->count_messages("^DELETE FROM dvdtag "), 1, "add one: update executed one delete"; is $dbic_trace->count_messages("^INSERT INTO dvdtag "), 3, "add one: update executed three insert"; is $tag_item->dvds_rs->count, 3, "add one: tag item has 3 dvds"; # # removing one # shift @$dvd_ids; %updates = ( id => $tag_item->id, dvds => $dvd_ids, ); $dbic_trace->clear; $tag_rs->recursive_update(\%updates); ok $dbic_trace->count_messages('^DELETE FROM dvdtag WHERE \( tag = \? 
\)'), "remove one: update did remove all dvds'"; is $dbic_trace->count_messages("^DELETE FROM dvdtag "), 1, "remove one: update executed one delete"; is $dbic_trace->count_messages("^INSERT INTO dvdtag "), 2, "remove one: update executed two insert"; is $tag_item->dvds_rs->count, 2, "remove one: tag item has 2 dvds"; # # adding recursive # #push @$dvd_ids, ( 4, 5, 6 ); %updates = ( id => $tag_item->id, dvds => [ (map { { name => $_->name, dvd_id => $_->id } } $tag_item->dvds->all) , { name => "winnie", owner => 1 }, { name => "fanny" , owner => 1}, { name => "sammy" , owner => 1}, ], ); $dbic_trace->clear; $tag_rs->recursive_update(\%updates); ok $dbic_trace->count_messages('^DELETE FROM dvdtag WHERE \( tag = \? \)'), "add several: update did remove all dvds'"; is $dbic_trace->count_messages("^DELETE FROM dvdtag "), 1, "add several: update executed one delete"; is $dbic_trace->count_messages("^INSERT INTO dvdtag "), 5, "add several: update executed five inserts in dvdtag"; is $dbic_trace->count_messages("^INSERT INTO dvd "), 3, "add several: update executed three inserts in dvd"; is $tag_item->dvds_rs->count, 5, "add several: tag item has 5 dvds"; # # removing several # $dvd_ids = [3,5]; %updates = ( id => $tag_item->id, dvds => $dvd_ids, ); $dbic_trace->clear; $tag_rs->recursive_update(\%updates); ok $dbic_trace->count_messages('^DELETE FROM dvdtag WHERE \( tag = \? \)'), "remove several: update did remove all dvds'"; is $dbic_trace->count_messages("^DELETE FROM dvdtag "), 1, "remove several: update executed one delete"; is $dbic_trace->count_messages("^INSERT INTO dvdtag "), 2, "remove several: update executed two insert"; is $tag_item->dvds_rs->count, 2, "remove several: tag item has 2 dvds"; # # empty arrayref # $dvd_ids = []; %updates = ( id => $tag_item->id, dvds => $dvd_ids, ); $dbic_trace->clear; $tag_rs->recursive_update(\%updates); ok $dbic_trace->count_messages('^DELETE FROM dvdtag WHERE \( tag = \? \)'), "remove all: update did remove all dvds'"; is $dbic_trace->count_messages("^INSERT INTO dvdtag "), 0, "remove all: update executed no insert"; is $tag_item->dvds_rs->count, 0, "remove all: tag item has no dvds"; done_testing; DBIx-Class-ResultSet-RecursiveUpdate-0.40/LICENSE0000644000175000017500000004406413556035664021516 0ustar ahartmaiahartmaiThis software is copyright (c) 2019 by Zbigniew Lukasiak, John Napiorkowski, Alexander Hartmaier. This is free software; you can redistribute it and/or modify it under the same terms as the Perl 5 programming language system itself. Terms of the Perl programming language system itself a) the GNU General Public License as published by the Free Software Foundation; either version 1, or (at your option) any later version, or b) the "Artistic License" --- The GNU General Public License, Version 1, February 1989 --- This software is Copyright (c) 2019 by Zbigniew Lukasiak, John Napiorkowski, Alexander Hartmaier. This is free software, licensed under: The GNU General Public License, Version 1, February 1989 GNU GENERAL PUBLIC LICENSE Version 1, February 1989 Copyright (C) 1989 Free Software Foundation, Inc. 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA Everyone is permitted to copy and distribute verbatim copies of this license document, but changing it is not allowed. Preamble The license agreements of most software companies try to keep users at the mercy of those companies. 
By contrast, our General Public License is intended to guarantee your freedom to share and change free software--to make sure the software is free for all its users. The General Public License applies to the Free Software Foundation's software and to any other program whose authors commit to using it. You can use it for your programs, too. When we speak of free software, we are referring to freedom, not price. Specifically, the General Public License is designed to make sure that you have the freedom to give away or sell copies of free software, that you receive source code or can get it if you want it, that you can change the software or use pieces of it in new free programs; and that you know you can do these things. To protect your rights, we need to make restrictions that forbid anyone to deny you these rights or to ask you to surrender the rights. These restrictions translate to certain responsibilities for you if you distribute copies of the software, or if you modify it. For example, if you distribute copies of a such a program, whether gratis or for a fee, you must give the recipients all the rights that you have. You must make sure that they, too, receive or can get the source code. And you must tell them their rights. We protect your rights with two steps: (1) copyright the software, and (2) offer you this license which gives you legal permission to copy, distribute and/or modify the software. Also, for each author's protection and ours, we want to make certain that everyone understands that there is no warranty for this free software. If the software is modified by someone else and passed on, we want its recipients to know that what they have is not the original, so that any problems introduced by others will not reflect on the original authors' reputations. The precise terms and conditions for copying, distribution and modification follow. GNU GENERAL PUBLIC LICENSE TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION 0. This License Agreement applies to any program or other work which contains a notice placed by the copyright holder saying it may be distributed under the terms of this General Public License. The "Program", below, refers to any such program or work, and a "work based on the Program" means either the Program or any work containing the Program or a portion of it, either verbatim or with modifications. Each licensee is addressed as "you". 1. You may copy and distribute verbatim copies of the Program's source code as you receive it, in any medium, provided that you conspicuously and appropriately publish on each copy an appropriate copyright notice and disclaimer of warranty; keep intact all the notices that refer to this General Public License and to the absence of any warranty; and give any other recipients of the Program a copy of this General Public License along with the Program. You may charge a fee for the physical act of transferring a copy. 2. 
You may modify your copy or copies of the Program or any portion of it, and copy and distribute such modifications under the terms of Paragraph 1 above, provided that you also do the following: a) cause the modified files to carry prominent notices stating that you changed the files and the date of any change; and b) cause the whole of any work that you distribute or publish, that in whole or in part contains the Program or any part thereof, either with or without modifications, to be licensed at no charge to all third parties under the terms of this General Public License (except that you may choose to grant warranty protection to some or all third parties, at your option). c) If the modified program normally reads commands interactively when run, you must cause it, when started running for such interactive use in the simplest and most usual way, to print or display an announcement including an appropriate copyright notice and a notice that there is no warranty (or else, saying that you provide a warranty) and that users may redistribute the program under these conditions, and telling the user how to view a copy of this General Public License. d) You may charge a fee for the physical act of transferring a copy, and you may at your option offer warranty protection in exchange for a fee. Mere aggregation of another independent work with the Program (or its derivative) on a volume of a storage or distribution medium does not bring the other work under the scope of these terms. 3. You may copy and distribute the Program (or a portion or derivative of it, under Paragraph 2) in object code or executable form under the terms of Paragraphs 1 and 2 above provided that you also do one of the following: a) accompany it with the complete corresponding machine-readable source code, which must be distributed under the terms of Paragraphs 1 and 2 above; or, b) accompany it with a written offer, valid for at least three years, to give any third party free (except for a nominal charge for the cost of distribution) a complete machine-readable copy of the corresponding source code, to be distributed under the terms of Paragraphs 1 and 2 above; or, c) accompany it with the information you received as to where the corresponding source code may be obtained. (This alternative is allowed only for noncommercial distribution and only if you received the program in object code or executable form alone.) Source code for a work means the preferred form of the work for making modifications to it. For an executable file, complete source code means all the source code for all modules it contains; but, as a special exception, it need not include source code for modules which are standard libraries that accompany the operating system on which the executable file runs, or for standard header files or definitions files that accompany that operating system. 4. You may not copy, modify, sublicense, distribute or transfer the Program except as expressly provided under this General Public License. Any attempt otherwise to copy, modify, sublicense, distribute or transfer the Program is void, and will automatically terminate your rights to use the Program under this License. However, parties who have received copies, or rights to use copies, from you under this General Public License will not have their licenses terminated so long as such parties remain in full compliance. 5. 
By copying, distributing or modifying the Program (or any work based on the Program) you indicate your acceptance of this license to do so, and all its terms and conditions. 6. Each time you redistribute the Program (or any work based on the Program), the recipient automatically receives a license from the original licensor to copy, distribute or modify the Program subject to these terms and conditions. You may not impose any further restrictions on the recipients' exercise of the rights granted herein. 7. The Free Software Foundation may publish revised and/or new versions of the General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns. Each version is given a distinguishing version number. If the Program specifies a version number of the license which applies to it and "any later version", you have the option of following the terms and conditions either of that version or of any later version published by the Free Software Foundation. If the Program does not specify a version number of the license, you may choose any version ever published by the Free Software Foundation. 8. If you wish to incorporate parts of the Program into other free programs whose distribution conditions are different, write to the author to ask for permission. For software which is copyrighted by the Free Software Foundation, write to the Free Software Foundation; we sometimes make exceptions for this. Our decision will be guided by the two goals of preserving the free status of all derivatives of our free software and of promoting the sharing and reuse of software generally. NO WARRANTY 9. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION. 10. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. END OF TERMS AND CONDITIONS Appendix: How to Apply These Terms to Your New Programs If you develop a new program, and you want it to be of the greatest possible use to humanity, the best way to achieve this is to make it free software which everyone can redistribute and change under these terms. To do so, attach the following notices to the program. It is safest to attach them to the start of each source file to most effectively convey the exclusion of warranty; and each file should have at least the "copyright" line and a pointer to where the full notice is found. 
Copyright (C) 19yy This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 1, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program; if not, write to the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston MA 02110-1301 USA Also add information on how to contact you by electronic and paper mail. If the program is interactive, make it output a short notice like this when it starts in an interactive mode: Gnomovision version 69, Copyright (C) 19xx name of author Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'. This is free software, and you are welcome to redistribute it under certain conditions; type `show c' for details. The hypothetical commands `show w' and `show c' should show the appropriate parts of the General Public License. Of course, the commands you use may be called something other than `show w' and `show c'; they could even be mouse-clicks or menu items--whatever suits your program. You should also get your employer (if you work as a programmer) or your school, if any, to sign a "copyright disclaimer" for the program, if necessary. Here a sample; alter the names: Yoyodyne, Inc., hereby disclaims all copyright interest in the program `Gnomovision' (a program to direct compilers to make passes at assemblers) written by James Hacker. , 1 April 1989 Ty Coon, President of Vice That's all there is to it! --- The Artistic License 1.0 --- This software is Copyright (c) 2019 by Zbigniew Lukasiak, John Napiorkowski, Alexander Hartmaier. This is free software, licensed under: The Artistic License 1.0 The Artistic License Preamble The intent of this document is to state the conditions under which a Package may be copied, such that the Copyright Holder maintains some semblance of artistic control over the development of the package, while giving the users of the package the right to use and distribute the Package in a more-or-less customary fashion, plus the right to make reasonable modifications. Definitions: - "Package" refers to the collection of files distributed by the Copyright Holder, and derivatives of that collection of files created through textual modification. - "Standard Version" refers to such a Package if it has not been modified, or has been modified in accordance with the wishes of the Copyright Holder. - "Copyright Holder" is whoever is named in the copyright or copyrights for the package. - "You" is you, if you're thinking about copying or distributing this Package. - "Reasonable copying fee" is whatever you can justify on the basis of media cost, duplication charges, time of people involved, and so on. (You will not be required to justify it to the Copyright Holder, but only to the computing community at large as a market that must bear the fee.) - "Freely Available" means that no fee is charged for the item itself, though there may be fees involved in handling the item. It also means that recipients of the item may redistribute it under the same conditions they received it. 1. 
You may make and give away verbatim copies of the source form of the Standard Version of this Package without restriction, provided that you duplicate all of the original copyright notices and associated disclaimers. 2. You may apply bug fixes, portability fixes and other modifications derived from the Public Domain or from the Copyright Holder. A Package modified in such a way shall still be considered the Standard Version. 3. You may otherwise modify your copy of this Package in any way, provided that you insert a prominent notice in each changed file stating how and when you changed that file, and provided that you do at least ONE of the following: a) place your modifications in the Public Domain or otherwise make them Freely Available, such as by posting said modifications to Usenet or an equivalent medium, or placing the modifications on a major archive site such as ftp.uu.net, or by allowing the Copyright Holder to include your modifications in the Standard Version of the Package. b) use the modified Package only within your corporation or organization. c) rename any non-standard executables so the names do not conflict with standard executables, which must also be provided, and provide a separate manual page for each non-standard executable that clearly documents how it differs from the Standard Version. d) make other distribution arrangements with the Copyright Holder. 4. You may distribute the programs of this Package in object code or executable form, provided that you do at least ONE of the following: a) distribute a Standard Version of the executables and library files, together with instructions (in the manual page or equivalent) on where to get the Standard Version. b) accompany the distribution with the machine-readable source of the Package with your modifications. c) accompany any non-standard executables with their corresponding Standard Version executables, giving the non-standard executables non-standard names, and clearly documenting the differences in manual pages (or equivalent), together with instructions on where to get the Standard Version. d) make other distribution arrangements with the Copyright Holder. 5. You may charge a reasonable copying fee for any distribution of this Package. You may charge any fee you choose for support of this Package. You may not charge a fee for this Package itself. However, you may distribute this Package in aggregate with other (possibly commercial) programs as part of a larger (possibly commercial) software distribution provided that you do not advertise this Package as a product of your own. 6. The scripts and library files supplied as input to or produced as output from the programs of this Package do not automatically fall under the copyright of this Package, but belong to whomever generated them, and may be sold commercially, and may be aggregated with this Package. 7. C or perl subroutines supplied by you and linked into this Package shall not be considered part of this Package. 8. The name of the Copyright Holder may not be used to endorse or promote products derived from this software without specific prior written permission. 9. THIS PACKAGE IS PROVIDED "AS IS" AND WITHOUT ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, WITHOUT LIMITATION, THE IMPLIED WARRANTIES OF MERCHANTIBILITY AND FITNESS FOR A PARTICULAR PURPOSE. 
The End DBIx-Class-ResultSet-RecursiveUpdate-0.40/dist.ini0000644000175000017500000000231113556035664022142 0ustar ahartmaiahartmainame = DBIx-Class-ResultSet-RecursiveUpdate author = Zbigniew Lukasiak author = John Napiorkowski author = Alexander Hartmaier author = Gerda Shank license = Perl_5 copyright_holder = Zbigniew Lukasiak, John Napiorkowski, Alexander Hartmaier copyright_year = 2019 version = 0.40 [@Basic] [PodWeaver] [PkgVersion] [NextRelease] [MetaConfig] [MetaJSON] [MetaNoIndex] directory = t/lib directory = t_dbic/lib [MetaResources] repository.type = git repository.url = git://github.com/gshank/dbix-class-resultset-recursiveupdate repository.web = http://github.com/gshank/dbix-class-resultset-recursiveupdate [PodSyntaxTests] [PodCoverageTests] [Test::Portability] [Test::DistManifest] [Test::Synopsis] [Test::UnusedVars] [HasVersionTests] [@Git] commit_msg = version %v%n%n%c tag_format = %v tag_message = %v [Prereqs] DBIx::Class = 0.08103 DBIx::Class::IntrospectableM2M = 0 SQL::Translator = 0.11016 DateTime = 0 DBD::SQLite = 1.21 List::MoreUtils = 0.22 Carp::Clan = 6.04 Data::Dumper::Concise = 2.020 Try::Tiny = 0.30 [Prereqs / TestRequires] Test::More = 0.88 Test::Warn = 0.20 Test::Trap = 0.2.2 Test::DBIC::ExpectedQueries = 2.001 DBIx-Class-ResultSet-RecursiveUpdate-0.40/META.yml0000644000175000017500000002350013556035664021752 0ustar ahartmaiahartmai--- abstract: 'like update_or_create - but recursive' author: - 'Zbigniew Lukasiak ' - 'John Napiorkowski ' - 'Alexander Hartmaier ' - 'Gerda Shank ' build_requires: Test::DBIC::ExpectedQueries: '2.001' Test::More: '0.88' Test::Trap: v0.2.2 Test::Warn: '0.20' configure_requires: ExtUtils::MakeMaker: '0' dynamic_config: 0 generated_by: 'Dist::Zilla version 6.012, CPAN::Meta::Converter version 2.150010' license: perl meta-spec: url: http://module-build.sourceforge.net/META-spec-v1.4.html version: '1.4' name: DBIx-Class-ResultSet-RecursiveUpdate no_index: directory: - t/lib - t_dbic/lib requires: Carp::Clan: '6.04' DBD::SQLite: '1.21' DBIx::Class: '0.08103' DBIx::Class::IntrospectableM2M: '0' Data::Dumper::Concise: '2.020' DateTime: '0' List::MoreUtils: '0.22' SQL::Translator: '0.11016' Try::Tiny: '0.30' resources: repository: git://github.com/gshank/dbix-class-resultset-recursiveupdate version: '0.40' x_Dist_Zilla: perl: version: '5.030000' plugins: - class: Dist::Zilla::Plugin::GatherDir config: Dist::Zilla::Plugin::GatherDir: exclude_filename: [] exclude_match: [] follow_symlinks: 0 include_dotfiles: 0 prefix: '' prune_directory: [] root: . 
name: '@Basic/GatherDir' version: '6.012' - class: Dist::Zilla::Plugin::PruneCruft name: '@Basic/PruneCruft' version: '6.012' - class: Dist::Zilla::Plugin::ManifestSkip name: '@Basic/ManifestSkip' version: '6.012' - class: Dist::Zilla::Plugin::MetaYAML name: '@Basic/MetaYAML' version: '6.012' - class: Dist::Zilla::Plugin::License name: '@Basic/License' version: '6.012' - class: Dist::Zilla::Plugin::Readme name: '@Basic/Readme' version: '6.012' - class: Dist::Zilla::Plugin::ExtraTests name: '@Basic/ExtraTests' version: '6.012' - class: Dist::Zilla::Plugin::ExecDir name: '@Basic/ExecDir' version: '6.012' - class: Dist::Zilla::Plugin::ShareDir name: '@Basic/ShareDir' version: '6.012' - class: Dist::Zilla::Plugin::MakeMaker config: Dist::Zilla::Role::TestRunner: default_jobs: 1 name: '@Basic/MakeMaker' version: '6.012' - class: Dist::Zilla::Plugin::Manifest name: '@Basic/Manifest' version: '6.012' - class: Dist::Zilla::Plugin::TestRelease name: '@Basic/TestRelease' version: '6.012' - class: Dist::Zilla::Plugin::ConfirmRelease name: '@Basic/ConfirmRelease' version: '6.012' - class: Dist::Zilla::Plugin::UploadToCPAN name: '@Basic/UploadToCPAN' version: '6.012' - class: Dist::Zilla::Plugin::PodWeaver config: Dist::Zilla::Plugin::PodWeaver: finder: - ':InstallModules' - ':ExecFiles' plugins: - class: Pod::Weaver::Plugin::EnsurePod5 name: '@CorePrep/EnsurePod5' version: '4.015' - class: Pod::Weaver::Plugin::H1Nester name: '@CorePrep/H1Nester' version: '4.015' - class: Pod::Weaver::Plugin::SingleEncoding name: '@Default/SingleEncoding' version: '4.015' - class: Pod::Weaver::Section::Name name: '@Default/Name' version: '4.015' - class: Pod::Weaver::Section::Version name: '@Default/Version' version: '4.015' - class: Pod::Weaver::Section::Region name: '@Default/prelude' version: '4.015' - class: Pod::Weaver::Section::Generic name: SYNOPSIS version: '4.015' - class: Pod::Weaver::Section::Generic name: DESCRIPTION version: '4.015' - class: Pod::Weaver::Section::Generic name: OVERVIEW version: '4.015' - class: Pod::Weaver::Section::Collect name: ATTRIBUTES version: '4.015' - class: Pod::Weaver::Section::Collect name: METHODS version: '4.015' - class: Pod::Weaver::Section::Collect name: FUNCTIONS version: '4.015' - class: Pod::Weaver::Section::Leftovers name: '@Default/Leftovers' version: '4.015' - class: Pod::Weaver::Section::Region name: '@Default/postlude' version: '4.015' - class: Pod::Weaver::Section::Authors name: '@Default/Authors' version: '4.015' - class: Pod::Weaver::Section::Legal name: '@Default/Legal' version: '4.015' name: PodWeaver version: '4.008' - class: Dist::Zilla::Plugin::PkgVersion name: PkgVersion version: '6.012' - class: Dist::Zilla::Plugin::NextRelease name: NextRelease version: '6.012' - class: Dist::Zilla::Plugin::MetaConfig name: MetaConfig version: '6.012' - class: Dist::Zilla::Plugin::MetaJSON name: MetaJSON version: '6.012' - class: Dist::Zilla::Plugin::MetaNoIndex name: MetaNoIndex version: '6.012' - class: Dist::Zilla::Plugin::MetaResources name: MetaResources version: '6.012' - class: Dist::Zilla::Plugin::PodSyntaxTests name: PodSyntaxTests version: '6.012' - class: Dist::Zilla::Plugin::PodCoverageTests name: PodCoverageTests version: '6.012' - class: Dist::Zilla::Plugin::Test::Portability config: Dist::Zilla::Plugin::Test::Portability: options: '' name: Test::Portability version: '2.001000' - class: Dist::Zilla::Plugin::Test::DistManifest name: Test::DistManifest version: '2.000005' - class: Dist::Zilla::Plugin::Test::Synopsis name: Test::Synopsis version: '2.000007' - 
class: Dist::Zilla::Plugin::Test::UnusedVars name: Test::UnusedVars version: '2.000007' - class: Dist::Zilla::Plugin::HasVersionTests name: HasVersionTests version: '1.101420' - class: Dist::Zilla::Plugin::Git::Check config: Dist::Zilla::Plugin::Git::Check: untracked_files: die Dist::Zilla::Role::Git::DirtyFiles: allow_dirty: - Changes - dist.ini allow_dirty_match: [] changelog: Changes Dist::Zilla::Role::Git::Repo: git_version: 2.20.1 repo_root: . name: '@Git/Check' version: '2.046' - class: Dist::Zilla::Plugin::Git::Commit config: Dist::Zilla::Plugin::Git::Commit: add_files_in: [] commit_msg: 'version %v%n%n%c' Dist::Zilla::Role::Git::DirtyFiles: allow_dirty: - Changes - dist.ini allow_dirty_match: [] changelog: Changes Dist::Zilla::Role::Git::Repo: git_version: 2.20.1 repo_root: . Dist::Zilla::Role::Git::StringFormatter: time_zone: local name: '@Git/Commit' version: '2.046' - class: Dist::Zilla::Plugin::Git::Tag config: Dist::Zilla::Plugin::Git::Tag: branch: ~ changelog: Changes signed: 0 tag: '0.40' tag_format: '%v' tag_message: '%v' Dist::Zilla::Role::Git::Repo: git_version: 2.20.1 repo_root: . Dist::Zilla::Role::Git::StringFormatter: time_zone: local name: '@Git/Tag' version: '2.046' - class: Dist::Zilla::Plugin::Git::Push config: Dist::Zilla::Plugin::Git::Push: push_to: - origin remotes_must_exist: 1 Dist::Zilla::Role::Git::Repo: git_version: 2.20.1 repo_root: . name: '@Git/Push' version: '2.046' - class: Dist::Zilla::Plugin::Prereqs config: Dist::Zilla::Plugin::Prereqs: phase: runtime type: requires name: Prereqs version: '6.012' - class: Dist::Zilla::Plugin::Prereqs config: Dist::Zilla::Plugin::Prereqs: phase: test type: requires name: TestRequires version: '6.012' - class: Dist::Zilla::Plugin::FinderCode name: ':InstallModules' version: '6.012' - class: Dist::Zilla::Plugin::FinderCode name: ':IncModules' version: '6.012' - class: Dist::Zilla::Plugin::FinderCode name: ':TestFiles' version: '6.012' - class: Dist::Zilla::Plugin::FinderCode name: ':ExtraTestFiles' version: '6.012' - class: Dist::Zilla::Plugin::FinderCode name: ':ExecFiles' version: '6.012' - class: Dist::Zilla::Plugin::FinderCode name: ':PerlExecFiles' version: '6.012' - class: Dist::Zilla::Plugin::FinderCode name: ':ShareFiles' version: '6.012' - class: Dist::Zilla::Plugin::FinderCode name: ':MainModule' version: '6.012' - class: Dist::Zilla::Plugin::FinderCode name: ':AllFiles' version: '6.012' - class: Dist::Zilla::Plugin::FinderCode name: ':NoFiles' version: '6.012' zilla: class: Dist::Zilla::Dist::Builder config: is_trial: '0' version: '6.012' x_generated_by_perl: v5.30.0 x_serialization_backend: 'YAML::Tiny version 1.73' DBIx-Class-ResultSet-RecursiveUpdate-0.40/MANIFEST0000644000175000017500000001552513556035664021642 0ustar ahartmaiahartmai# This file was automatically generated by Dist::Zilla::Plugin::Manifest v6.012. 
Changes LICENSE MANIFEST META.json META.yml Makefile.PL README dist.ini lib/DBIx/Class/ResultSet/RecursiveUpdate.pm t/00load.t t/01_basic.t t/02_cache.t t/96multi_create.t t/author-pod-coverage.t t/author-pod-syntax.t t/author-portability.t t/author-synopsis.t t/belongs_to_including_pks.t t/conditional_has_many.t t/lib/AnotherTestDB/OnePK/Schema.pm t/lib/AnotherTestDB/OnePK/Schema/Result/ConditionItem.pm t/lib/AnotherTestDB/OnePK/Schema/Result/Item.pm t/lib/AnotherTestDB/OnePK/Schema/Result/RelatedItem.pm t/lib/AnotherTestDB/TwoPK/Schema.pm t/lib/AnotherTestDB/TwoPK/Schema/Result/ConditionItem.pm t/lib/AnotherTestDB/TwoPK/Schema/Result/Item.pm t/lib/AnotherTestDB/TwoPK/Schema/Result/RelatedItem.pm t/lib/DBICTest.pm t/lib/DBICTest/ErrorComponent.pm t/lib/DBICTest/FakeComponent.pm t/lib/DBICTest/ForeignComponent.pm t/lib/DBICTest/ForeignComponent/TestComp.pm t/lib/DBICTest/OptionalComponent.pm t/lib/DBICTest/Plain.pm t/lib/DBICTest/Plain/Test.pm t/lib/DBICTest/ResultSetManager.pm t/lib/DBICTest/ResultSetManager/Foo.pm t/lib/DBICTest/Schema.pm t/lib/DBICTest/Schema/Artist.pm t/lib/DBICTest/Schema/ArtistSourceName.pm t/lib/DBICTest/Schema/ArtistSubclass.pm t/lib/DBICTest/Schema/ArtistUndirectedMap.pm t/lib/DBICTest/Schema/Artwork.pm t/lib/DBICTest/Schema/Artwork_to_Artist.pm t/lib/DBICTest/Schema/Bookmark.pm t/lib/DBICTest/Schema/BooksInLibrary.pm t/lib/DBICTest/Schema/CD.pm t/lib/DBICTest/Schema/CD_to_Producer.pm t/lib/DBICTest/Schema/Collection.pm t/lib/DBICTest/Schema/CollectionObject.pm t/lib/DBICTest/Schema/Dummy.pm t/lib/DBICTest/Schema/Employee.pm t/lib/DBICTest/Schema/Event.pm t/lib/DBICTest/Schema/EventTZ.pm t/lib/DBICTest/Schema/FileColumn.pm t/lib/DBICTest/Schema/ForceForeign.pm t/lib/DBICTest/Schema/FourKeys.pm t/lib/DBICTest/Schema/FourKeys_to_TwoKeys.pm t/lib/DBICTest/Schema/Genre.pm t/lib/DBICTest/Schema/Image.pm t/lib/DBICTest/Schema/LinerNotes.pm t/lib/DBICTest/Schema/Link.pm t/lib/DBICTest/Schema/LyricVersion.pm t/lib/DBICTest/Schema/Lyrics.pm t/lib/DBICTest/Schema/NoPrimaryKey.pm t/lib/DBICTest/Schema/NoSuchClass.pm t/lib/DBICTest/Schema/OneKey.pm t/lib/DBICTest/Schema/Owners.pm t/lib/DBICTest/Schema/Producer.pm t/lib/DBICTest/Schema/SelfRef.pm t/lib/DBICTest/Schema/SelfRefAlias.pm t/lib/DBICTest/Schema/SequenceTest.pm t/lib/DBICTest/Schema/Serialized.pm t/lib/DBICTest/Schema/Tag.pm t/lib/DBICTest/Schema/Track.pm t/lib/DBICTest/Schema/TreeLike.pm t/lib/DBICTest/Schema/TwoKeyTreeLike.pm t/lib/DBICTest/Schema/TwoKeys.pm t/lib/DBICTest/Schema/TypedObject.pm t/lib/DBICTest/Stats.pm t/lib/DBICTest/SyntaxErrorComponent1.pm t/lib/DBICTest/SyntaxErrorComponent2.pm t/lib/DBICTest/SyntaxErrorComponent3.pm t/lib/DBSchema.pm t/lib/DBSchema/Result/Address.pm t/lib/DBSchema/Result/Dvd.pm t/lib/DBSchema/Result/Dvdtag.pm t/lib/DBSchema/Result/KeysByMethod.pm t/lib/DBSchema/Result/LinerNotes.pm t/lib/DBSchema/Result/Onekey.pm t/lib/DBSchema/Result/Owner.pm t/lib/DBSchema/Result/Personality.pm t/lib/DBSchema/Result/Podcast.pm t/lib/DBSchema/Result/Role.pm t/lib/DBSchema/Result/Tag.pm t/lib/DBSchema/Result/Twokeys.pm t/lib/DBSchema/Result/Twokeys_belongsto.pm t/lib/DBSchema/Result/User.pm t/lib/DBSchema/Result/UserRole.pm t/lib/DBSchema/Result/Viewing.pm t/lib/DBSchemaBase.pm t/lib/DBSchemaMoose.pm t/lib/DBSchemaMoose/ResultSet.pm t/lib/DebugObject.pm t/lib/MySchema.pm t/lib/MySchema/Test.pm t/lib/TwoPkHasManyDB/Schema.pm t/lib/TwoPkHasManyDB/Schema/Result/Item.pm t/lib/TwoPkHasManyDB/Schema/Result/RelatedItem.pm t/lib/TwoPkHasManyDB/Schema/Result/RelatedItem2.pm t/lib/sqlite.sql 
t/pod-coverage.t t/pod.t t/release-dist-manifest.t t/release-has-version.t t/release-unused-vars.t t/twopk_has_many.t t/undef_pk.t t/update_introspectable_m2m.t t/var/dvdzbr.db t/var/placeholder t_dbic/lib/DBICTest.pm t_dbic/lib/DBICTest/BaseResult.pm t_dbic/lib/DBICTest/BaseResultSet.pm t_dbic/lib/DBICTest/BaseSchema.pm t_dbic/lib/DBICTest/Cursor.pm t_dbic/lib/DBICTest/DeployComponent.pm t_dbic/lib/DBICTest/ErrorComponent.pm t_dbic/lib/DBICTest/FakeComponent.pm t_dbic/lib/DBICTest/ForeignComponent.pm t_dbic/lib/DBICTest/ForeignComponent/TestComp.pm t_dbic/lib/DBICTest/OptionalComponent.pm t_dbic/lib/DBICTest/ResultSetManager.pm t_dbic/lib/DBICTest/ResultSetManager/Foo.pm t_dbic/lib/DBICTest/RunMode.pm t_dbic/lib/DBICTest/Schema.pm t_dbic/lib/DBICTest/Schema/Artist.pm t_dbic/lib/DBICTest/Schema/ArtistGUID.pm t_dbic/lib/DBICTest/Schema/ArtistSourceName.pm t_dbic/lib/DBICTest/Schema/ArtistSubclass.pm t_dbic/lib/DBICTest/Schema/ArtistUndirectedMap.pm t_dbic/lib/DBICTest/Schema/Artwork.pm t_dbic/lib/DBICTest/Schema/Artwork_to_Artist.pm t_dbic/lib/DBICTest/Schema/BindType.pm t_dbic/lib/DBICTest/Schema/Bookmark.pm t_dbic/lib/DBICTest/Schema/BooksInLibrary.pm t_dbic/lib/DBICTest/Schema/CD.pm t_dbic/lib/DBICTest/Schema/CD_to_Producer.pm t_dbic/lib/DBICTest/Schema/Collection.pm t_dbic/lib/DBICTest/Schema/CollectionObject.pm t_dbic/lib/DBICTest/Schema/ComputedColumn.pm t_dbic/lib/DBICTest/Schema/CustomSql.pm t_dbic/lib/DBICTest/Schema/Dummy.pm t_dbic/lib/DBICTest/Schema/Employee.pm t_dbic/lib/DBICTest/Schema/Encoded.pm t_dbic/lib/DBICTest/Schema/Event.pm t_dbic/lib/DBICTest/Schema/EventSmallDT.pm t_dbic/lib/DBICTest/Schema/EventTZ.pm t_dbic/lib/DBICTest/Schema/EventTZDeprecated.pm t_dbic/lib/DBICTest/Schema/EventTZPg.pm t_dbic/lib/DBICTest/Schema/ForceForeign.pm t_dbic/lib/DBICTest/Schema/FourKeys.pm t_dbic/lib/DBICTest/Schema/FourKeys_to_TwoKeys.pm t_dbic/lib/DBICTest/Schema/Genre.pm t_dbic/lib/DBICTest/Schema/Image.pm t_dbic/lib/DBICTest/Schema/LinerNotes.pm t_dbic/lib/DBICTest/Schema/Link.pm t_dbic/lib/DBICTest/Schema/LyricVersion.pm t_dbic/lib/DBICTest/Schema/Lyrics.pm t_dbic/lib/DBICTest/Schema/Money.pm t_dbic/lib/DBICTest/Schema/NoPrimaryKey.pm t_dbic/lib/DBICTest/Schema/NoSuchClass.pm t_dbic/lib/DBICTest/Schema/OneKey.pm t_dbic/lib/DBICTest/Schema/Owners.pm t_dbic/lib/DBICTest/Schema/Producer.pm t_dbic/lib/DBICTest/Schema/PunctuatedColumnName.pm t_dbic/lib/DBICTest/Schema/SelfRef.pm t_dbic/lib/DBICTest/Schema/SelfRefAlias.pm t_dbic/lib/DBICTest/Schema/SequenceTest.pm t_dbic/lib/DBICTest/Schema/Serialized.pm t_dbic/lib/DBICTest/Schema/Tag.pm t_dbic/lib/DBICTest/Schema/TimestampPrimaryKey.pm t_dbic/lib/DBICTest/Schema/Track.pm t_dbic/lib/DBICTest/Schema/TreeLike.pm t_dbic/lib/DBICTest/Schema/TwoKeyTreeLike.pm t_dbic/lib/DBICTest/Schema/TwoKeys.pm t_dbic/lib/DBICTest/Schema/TypedObject.pm t_dbic/lib/DBICTest/Schema/VaryingMAX.pm t_dbic/lib/DBICTest/Schema/Year1999CDs.pm t_dbic/lib/DBICTest/Schema/Year2000CDs.pm t_dbic/lib/DBICTest/Stats.pm t_dbic/lib/DBICTest/SyntaxErrorComponent1.pm t_dbic/lib/DBICTest/SyntaxErrorComponent2.pm t_dbic/lib/DBICTest/SyntaxErrorComponent3.pm t_dbic/lib/DBICTest/Taint/Classes/Auto.pm t_dbic/lib/DBICTest/Taint/Classes/Manual.pm t_dbic/lib/DBICTest/Taint/Namespaces/Result/Test.pm t_dbic/lib/DBICTest/Util.pm t_dbic/lib/DBICTest/Util/OverrideRequire.pm t_dbic/lib/sqlite.sql t_dbic/might_have.t DBIx-Class-ResultSet-RecursiveUpdate-0.40/META.json0000644000175000017500000003726213556035664022134 0ustar ahartmaiahartmai{ "abstract" : "like update_or_create - but recursive", 
"author" : [ "Zbigniew Lukasiak ", "John Napiorkowski ", "Alexander Hartmaier ", "Gerda Shank " ], "dynamic_config" : 0, "generated_by" : "Dist::Zilla version 6.012, CPAN::Meta::Converter version 2.150010", "license" : [ "perl_5" ], "meta-spec" : { "url" : "http://search.cpan.org/perldoc?CPAN::Meta::Spec", "version" : 2 }, "name" : "DBIx-Class-ResultSet-RecursiveUpdate", "no_index" : { "directory" : [ "t/lib", "t_dbic/lib" ] }, "prereqs" : { "configure" : { "requires" : { "ExtUtils::MakeMaker" : "0" } }, "develop" : { "requires" : { "Pod::Coverage::TrustPod" : "0", "Test::More" : "0", "Test::Pod" : "1.41", "Test::Pod::Coverage" : "1.08", "Test::Portability::Files" : "0", "Test::Synopsis" : "0" } }, "runtime" : { "requires" : { "Carp::Clan" : "6.04", "DBD::SQLite" : "1.21", "DBIx::Class" : "0.08103", "DBIx::Class::IntrospectableM2M" : "0", "Data::Dumper::Concise" : "2.020", "DateTime" : "0", "List::MoreUtils" : "0.22", "SQL::Translator" : "0.11016", "Try::Tiny" : "0.30" } }, "test" : { "requires" : { "Test::DBIC::ExpectedQueries" : "2.001", "Test::More" : "0.88", "Test::Trap" : "v0.2.2", "Test::Warn" : "0.20" } } }, "release_status" : "stable", "resources" : { "repository" : { "type" : "git", "url" : "git://github.com/gshank/dbix-class-resultset-recursiveupdate", "web" : "http://github.com/gshank/dbix-class-resultset-recursiveupdate" } }, "version" : "0.40", "x_Dist_Zilla" : { "perl" : { "version" : "5.030000" }, "plugins" : [ { "class" : "Dist::Zilla::Plugin::GatherDir", "config" : { "Dist::Zilla::Plugin::GatherDir" : { "exclude_filename" : [], "exclude_match" : [], "follow_symlinks" : 0, "include_dotfiles" : 0, "prefix" : "", "prune_directory" : [], "root" : "." } }, "name" : "@Basic/GatherDir", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::PruneCruft", "name" : "@Basic/PruneCruft", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::ManifestSkip", "name" : "@Basic/ManifestSkip", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::MetaYAML", "name" : "@Basic/MetaYAML", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::License", "name" : "@Basic/License", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::Readme", "name" : "@Basic/Readme", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::ExtraTests", "name" : "@Basic/ExtraTests", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::ExecDir", "name" : "@Basic/ExecDir", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::ShareDir", "name" : "@Basic/ShareDir", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::MakeMaker", "config" : { "Dist::Zilla::Role::TestRunner" : { "default_jobs" : 1 } }, "name" : "@Basic/MakeMaker", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::Manifest", "name" : "@Basic/Manifest", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::TestRelease", "name" : "@Basic/TestRelease", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::ConfirmRelease", "name" : "@Basic/ConfirmRelease", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::UploadToCPAN", "name" : "@Basic/UploadToCPAN", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::PodWeaver", "config" : { "Dist::Zilla::Plugin::PodWeaver" : { "finder" : [ ":InstallModules", ":ExecFiles" ], "plugins" : [ { "class" : "Pod::Weaver::Plugin::EnsurePod5", "name" : "@CorePrep/EnsurePod5", "version" : "4.015" }, { "class" : "Pod::Weaver::Plugin::H1Nester", "name" : "@CorePrep/H1Nester", "version" : "4.015" }, { "class" : "Pod::Weaver::Plugin::SingleEncoding", "name" : "@Default/SingleEncoding", 
"version" : "4.015" }, { "class" : "Pod::Weaver::Section::Name", "name" : "@Default/Name", "version" : "4.015" }, { "class" : "Pod::Weaver::Section::Version", "name" : "@Default/Version", "version" : "4.015" }, { "class" : "Pod::Weaver::Section::Region", "name" : "@Default/prelude", "version" : "4.015" }, { "class" : "Pod::Weaver::Section::Generic", "name" : "SYNOPSIS", "version" : "4.015" }, { "class" : "Pod::Weaver::Section::Generic", "name" : "DESCRIPTION", "version" : "4.015" }, { "class" : "Pod::Weaver::Section::Generic", "name" : "OVERVIEW", "version" : "4.015" }, { "class" : "Pod::Weaver::Section::Collect", "name" : "ATTRIBUTES", "version" : "4.015" }, { "class" : "Pod::Weaver::Section::Collect", "name" : "METHODS", "version" : "4.015" }, { "class" : "Pod::Weaver::Section::Collect", "name" : "FUNCTIONS", "version" : "4.015" }, { "class" : "Pod::Weaver::Section::Leftovers", "name" : "@Default/Leftovers", "version" : "4.015" }, { "class" : "Pod::Weaver::Section::Region", "name" : "@Default/postlude", "version" : "4.015" }, { "class" : "Pod::Weaver::Section::Authors", "name" : "@Default/Authors", "version" : "4.015" }, { "class" : "Pod::Weaver::Section::Legal", "name" : "@Default/Legal", "version" : "4.015" } ] } }, "name" : "PodWeaver", "version" : "4.008" }, { "class" : "Dist::Zilla::Plugin::PkgVersion", "name" : "PkgVersion", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::NextRelease", "name" : "NextRelease", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::MetaConfig", "name" : "MetaConfig", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::MetaJSON", "name" : "MetaJSON", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::MetaNoIndex", "name" : "MetaNoIndex", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::MetaResources", "name" : "MetaResources", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::PodSyntaxTests", "name" : "PodSyntaxTests", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::PodCoverageTests", "name" : "PodCoverageTests", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::Test::Portability", "config" : { "Dist::Zilla::Plugin::Test::Portability" : { "options" : "" } }, "name" : "Test::Portability", "version" : "2.001000" }, { "class" : "Dist::Zilla::Plugin::Test::DistManifest", "name" : "Test::DistManifest", "version" : "2.000005" }, { "class" : "Dist::Zilla::Plugin::Test::Synopsis", "name" : "Test::Synopsis", "version" : "2.000007" }, { "class" : "Dist::Zilla::Plugin::Test::UnusedVars", "name" : "Test::UnusedVars", "version" : "2.000007" }, { "class" : "Dist::Zilla::Plugin::HasVersionTests", "name" : "HasVersionTests", "version" : "1.101420" }, { "class" : "Dist::Zilla::Plugin::Git::Check", "config" : { "Dist::Zilla::Plugin::Git::Check" : { "untracked_files" : "die" }, "Dist::Zilla::Role::Git::DirtyFiles" : { "allow_dirty" : [ "Changes", "dist.ini" ], "allow_dirty_match" : [], "changelog" : "Changes" }, "Dist::Zilla::Role::Git::Repo" : { "git_version" : "2.20.1", "repo_root" : "." } }, "name" : "@Git/Check", "version" : "2.046" }, { "class" : "Dist::Zilla::Plugin::Git::Commit", "config" : { "Dist::Zilla::Plugin::Git::Commit" : { "add_files_in" : [], "commit_msg" : "version %v%n%n%c" }, "Dist::Zilla::Role::Git::DirtyFiles" : { "allow_dirty" : [ "Changes", "dist.ini" ], "allow_dirty_match" : [], "changelog" : "Changes" }, "Dist::Zilla::Role::Git::Repo" : { "git_version" : "2.20.1", "repo_root" : "." 
}, "Dist::Zilla::Role::Git::StringFormatter" : { "time_zone" : "local" } }, "name" : "@Git/Commit", "version" : "2.046" }, { "class" : "Dist::Zilla::Plugin::Git::Tag", "config" : { "Dist::Zilla::Plugin::Git::Tag" : { "branch" : null, "changelog" : "Changes", "signed" : 0, "tag" : "0.40", "tag_format" : "%v", "tag_message" : "%v" }, "Dist::Zilla::Role::Git::Repo" : { "git_version" : "2.20.1", "repo_root" : "." }, "Dist::Zilla::Role::Git::StringFormatter" : { "time_zone" : "local" } }, "name" : "@Git/Tag", "version" : "2.046" }, { "class" : "Dist::Zilla::Plugin::Git::Push", "config" : { "Dist::Zilla::Plugin::Git::Push" : { "push_to" : [ "origin" ], "remotes_must_exist" : 1 }, "Dist::Zilla::Role::Git::Repo" : { "git_version" : "2.20.1", "repo_root" : "." } }, "name" : "@Git/Push", "version" : "2.046" }, { "class" : "Dist::Zilla::Plugin::Prereqs", "config" : { "Dist::Zilla::Plugin::Prereqs" : { "phase" : "runtime", "type" : "requires" } }, "name" : "Prereqs", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::Prereqs", "config" : { "Dist::Zilla::Plugin::Prereqs" : { "phase" : "test", "type" : "requires" } }, "name" : "TestRequires", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::FinderCode", "name" : ":InstallModules", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::FinderCode", "name" : ":IncModules", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::FinderCode", "name" : ":TestFiles", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::FinderCode", "name" : ":ExtraTestFiles", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::FinderCode", "name" : ":ExecFiles", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::FinderCode", "name" : ":PerlExecFiles", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::FinderCode", "name" : ":ShareFiles", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::FinderCode", "name" : ":MainModule", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::FinderCode", "name" : ":AllFiles", "version" : "6.012" }, { "class" : "Dist::Zilla::Plugin::FinderCode", "name" : ":NoFiles", "version" : "6.012" } ], "zilla" : { "class" : "Dist::Zilla::Dist::Builder", "config" : { "is_trial" : 0 }, "version" : "6.012" } }, "x_generated_by_perl" : "v5.30.0", "x_serialization_backend" : "Cpanel::JSON::XS version 4.12" } DBIx-Class-ResultSet-RecursiveUpdate-0.40/Makefile.PL0000644000175000017500000000353513556035664022461 0ustar ahartmaiahartmai# This file was automatically generated by Dist::Zilla::Plugin::MakeMaker v6.012. 
use strict; use warnings; use ExtUtils::MakeMaker; my %WriteMakefileArgs = ( "ABSTRACT" => "like update_or_create - but recursive", "AUTHOR" => "Zbigniew Lukasiak , John Napiorkowski , Alexander Hartmaier , Gerda Shank ", "CONFIGURE_REQUIRES" => { "ExtUtils::MakeMaker" => 0 }, "DISTNAME" => "DBIx-Class-ResultSet-RecursiveUpdate", "LICENSE" => "perl", "NAME" => "DBIx::Class::ResultSet::RecursiveUpdate", "PREREQ_PM" => { "Carp::Clan" => "6.04", "DBD::SQLite" => "1.21", "DBIx::Class" => "0.08103", "DBIx::Class::IntrospectableM2M" => 0, "Data::Dumper::Concise" => "2.020", "DateTime" => 0, "List::MoreUtils" => "0.22", "SQL::Translator" => "0.11016", "Try::Tiny" => "0.30" }, "TEST_REQUIRES" => { "Test::DBIC::ExpectedQueries" => "2.001", "Test::More" => "0.88", "Test::Trap" => "0.2.2", "Test::Warn" => "0.20" }, "VERSION" => "0.40", "test" => { "TESTS" => "t/*.t" } ); my %FallbackPrereqs = ( "Carp::Clan" => "6.04", "DBD::SQLite" => "1.21", "DBIx::Class" => "0.08103", "DBIx::Class::IntrospectableM2M" => 0, "Data::Dumper::Concise" => "2.020", "DateTime" => 0, "List::MoreUtils" => "0.22", "SQL::Translator" => "0.11016", "Test::DBIC::ExpectedQueries" => "2.001", "Test::More" => "0.88", "Test::Trap" => "0.2.2", "Test::Warn" => "0.20", "Try::Tiny" => "0.30" ); unless ( eval { ExtUtils::MakeMaker->VERSION(6.63_03) } ) { delete $WriteMakefileArgs{TEST_REQUIRES}; delete $WriteMakefileArgs{BUILD_REQUIRES}; $WriteMakefileArgs{PREREQ_PM} = \%FallbackPrereqs; } delete $WriteMakefileArgs{CONFIGURE_REQUIRES} unless eval { ExtUtils::MakeMaker->VERSION(6.52) }; WriteMakefile(%WriteMakefileArgs); DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/0000775000175000017500000000000013556035664021727 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/might_have.t0000644000175000017500000000165513556035664024234 0ustar ahartmaiahartmaiuse strict; use warnings; use Test::More; use Test::Exception; use lib qw(t_dbic/lib); use DBICTest; my $schema = DBICTest->init_schema(); # create a track with a 'cd_single' (might_have) my $track_id; lives_ok ( sub { my $cd = $schema->resultset('CD')->first; my $track = $schema->resultset('Track')->create ({ cd => $cd, title => 'Multicreate rocks', cd_single => { artist => $cd->artist, year => 2008, title => 'Disemboweling MultiCreate', tracks => [ { title => 'Why does mst write this way' }, { title => 'Chainsaw celebration' }, { title => 'Purl cleans up' }, ], }, }); isa_ok ($track, 'DBICTest::Track', 'Main Track object created'); $track_id = $track->id; is ($track->title, 'Multicreate rocks', 'Correct Track title'); my $single = $track->cd_single; isa_ok ($single, 'DBICTest::CD', 'Created a single with the track'); }); done_testing; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/0000775000175000017500000000000013556035664022475 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/sqlite.sql0000644000175000017500000002154113556035664024520 0ustar ahartmaiahartmai-- -- Created by SQL::Translator::Producer::SQLite -- Created on Fri Mar 2 18:22:33 2012 -- -- -- Table: artist -- CREATE TABLE artist ( artistid INTEGER PRIMARY KEY NOT NULL, name varchar(100), rank integer NOT NULL DEFAULT 13, charfield char(10) ); CREATE INDEX artist_name_hookidx ON artist (name); CREATE UNIQUE INDEX artist_name ON artist (name); CREATE UNIQUE INDEX u_nullable ON artist (charfield, rank); -- -- Table: bindtype_test -- CREATE TABLE bindtype_test ( id INTEGER PRIMARY KEY NOT NULL, bytea blob, blob blob, clob clob, a_memo memo ); -- -- Table: collection -- CREATE 
TABLE collection ( collectionid INTEGER PRIMARY KEY NOT NULL, name varchar(100) NOT NULL ); -- -- Table: encoded -- CREATE TABLE encoded ( id INTEGER PRIMARY KEY NOT NULL, encoded varchar(100) ); -- -- Table: event -- CREATE TABLE event ( id INTEGER PRIMARY KEY NOT NULL, starts_at date NOT NULL, created_on timestamp NOT NULL, varchar_date varchar(20), varchar_datetime varchar(20), skip_inflation datetime, ts_without_tz datetime ); -- -- Table: fourkeys -- CREATE TABLE fourkeys ( foo integer NOT NULL, bar integer NOT NULL, hello integer NOT NULL, goodbye integer NOT NULL, sensors character(10) NOT NULL, read_count int, PRIMARY KEY (foo, bar, hello, goodbye) ); -- -- Table: genre -- CREATE TABLE genre ( genreid INTEGER PRIMARY KEY NOT NULL, name varchar(100) NOT NULL ); CREATE UNIQUE INDEX genre_name ON genre (name); -- -- Table: link -- CREATE TABLE link ( id INTEGER PRIMARY KEY NOT NULL, url varchar(100), title varchar(100) ); -- -- Table: money_test -- CREATE TABLE money_test ( id INTEGER PRIMARY KEY NOT NULL, amount money ); -- -- Table: noprimarykey -- CREATE TABLE noprimarykey ( foo integer NOT NULL, bar integer NOT NULL, baz integer NOT NULL ); CREATE UNIQUE INDEX foo_bar ON noprimarykey (foo, bar); -- -- Table: onekey -- CREATE TABLE onekey ( id INTEGER PRIMARY KEY NOT NULL, artist integer NOT NULL, cd integer NOT NULL ); -- -- Table: owners -- CREATE TABLE owners ( id INTEGER PRIMARY KEY NOT NULL, name varchar(100) NOT NULL ); CREATE UNIQUE INDEX owners_name ON owners (name); -- -- Table: producer -- CREATE TABLE producer ( producerid INTEGER PRIMARY KEY NOT NULL, name varchar(100) NOT NULL ); CREATE UNIQUE INDEX prod_name ON producer (name); -- -- Table: self_ref -- CREATE TABLE self_ref ( id INTEGER PRIMARY KEY NOT NULL, name varchar(100) NOT NULL ); -- -- Table: sequence_test -- CREATE TABLE sequence_test ( pkid1 integer NOT NULL, pkid2 integer NOT NULL, nonpkid integer NOT NULL, name varchar(100), PRIMARY KEY (pkid1, pkid2) ); -- -- Table: serialized -- CREATE TABLE serialized ( id INTEGER PRIMARY KEY NOT NULL, serialized text NOT NULL ); -- -- Table: timestamp_primary_key_test -- CREATE TABLE timestamp_primary_key_test ( id timestamp NOT NULL DEFAULT current_timestamp, PRIMARY KEY (id) ); -- -- Table: treelike -- CREATE TABLE treelike ( id INTEGER PRIMARY KEY NOT NULL, parent integer, name varchar(100) NOT NULL ); CREATE INDEX treelike_idx_parent ON treelike (parent); -- -- Table: twokeytreelike -- CREATE TABLE twokeytreelike ( id1 integer NOT NULL, id2 integer NOT NULL, parent1 integer NOT NULL, parent2 integer NOT NULL, name varchar(100) NOT NULL, PRIMARY KEY (id1, id2) ); CREATE INDEX twokeytreelike_idx_parent1_parent2 ON twokeytreelike (parent1, parent2); CREATE UNIQUE INDEX tktlnameunique ON twokeytreelike (name); -- -- Table: typed_object -- CREATE TABLE typed_object ( objectid INTEGER PRIMARY KEY NOT NULL, type varchar(100) NOT NULL, value varchar(100) NOT NULL ); -- -- Table: artist_undirected_map -- CREATE TABLE artist_undirected_map ( id1 integer NOT NULL, id2 integer NOT NULL, PRIMARY KEY (id1, id2) ); CREATE INDEX artist_undirected_map_idx_id1 ON artist_undirected_map (id1); CREATE INDEX artist_undirected_map_idx_id2 ON artist_undirected_map (id2); -- -- Table: bookmark -- CREATE TABLE bookmark ( id INTEGER PRIMARY KEY NOT NULL, link integer ); CREATE INDEX bookmark_idx_link ON bookmark (link); -- -- Table: books -- CREATE TABLE books ( id INTEGER PRIMARY KEY NOT NULL, source varchar(100) NOT NULL, owner integer NOT NULL, title varchar(100) NOT NULL, price integer 
); CREATE INDEX books_idx_owner ON books (owner); CREATE UNIQUE INDEX books_title ON books (title); -- -- Table: employee -- CREATE TABLE employee ( employee_id INTEGER PRIMARY KEY NOT NULL, position integer NOT NULL, group_id integer, group_id_2 integer, group_id_3 integer, name varchar(100), encoded integer ); CREATE INDEX employee_idx_encoded ON employee (encoded); -- -- Table: forceforeign -- CREATE TABLE forceforeign ( artist INTEGER PRIMARY KEY NOT NULL, cd integer NOT NULL ); -- -- Table: self_ref_alias -- CREATE TABLE self_ref_alias ( self_ref integer NOT NULL, alias integer NOT NULL, PRIMARY KEY (self_ref, alias) ); CREATE INDEX self_ref_alias_idx_alias ON self_ref_alias (alias); CREATE INDEX self_ref_alias_idx_self_ref ON self_ref_alias (self_ref); -- -- Table: track -- CREATE TABLE track ( trackid INTEGER PRIMARY KEY NOT NULL, cd integer NOT NULL, position int NOT NULL, title varchar(100) NOT NULL, last_updated_on datetime, last_updated_at datetime ); CREATE INDEX track_idx_cd ON track (cd); CREATE UNIQUE INDEX track_cd_position ON track (cd, position); CREATE UNIQUE INDEX track_cd_title ON track (cd, title); -- -- Table: cd -- CREATE TABLE cd ( cdid INTEGER PRIMARY KEY NOT NULL, artist integer NOT NULL, title varchar(100) NOT NULL, year varchar(100) NOT NULL, genreid integer, single_track integer ); CREATE INDEX cd_idx_artist ON cd (artist); CREATE INDEX cd_idx_genreid ON cd (genreid); CREATE INDEX cd_idx_single_track ON cd (single_track); CREATE UNIQUE INDEX cd_artist_title ON cd (artist, title); -- -- Table: collection_object -- CREATE TABLE collection_object ( collection integer NOT NULL, object integer NOT NULL, PRIMARY KEY (collection, object) ); CREATE INDEX collection_object_idx_collection ON collection_object (collection); CREATE INDEX collection_object_idx_object ON collection_object (object); -- -- Table: lyrics -- CREATE TABLE lyrics ( lyric_id INTEGER PRIMARY KEY NOT NULL, track_id integer NOT NULL ); CREATE INDEX lyrics_idx_track_id ON lyrics (track_id); -- -- Table: cd_artwork -- CREATE TABLE cd_artwork ( cd_id INTEGER PRIMARY KEY NOT NULL ); -- -- Table: liner_notes -- CREATE TABLE liner_notes ( liner_id INTEGER PRIMARY KEY NOT NULL, notes varchar(100) NOT NULL ); -- -- Table: lyric_versions -- CREATE TABLE lyric_versions ( id INTEGER PRIMARY KEY NOT NULL, lyric_id integer NOT NULL, text varchar(100) NOT NULL ); CREATE INDEX lyric_versions_idx_lyric_id ON lyric_versions (lyric_id); -- -- Table: tags -- CREATE TABLE tags ( tagid INTEGER PRIMARY KEY NOT NULL, cd integer NOT NULL, tag varchar(100) NOT NULL ); CREATE INDEX tags_idx_cd ON tags (cd); CREATE UNIQUE INDEX tagid_cd ON tags (tagid, cd); CREATE UNIQUE INDEX tagid_cd_tag ON tags (tagid, cd, tag); CREATE UNIQUE INDEX tags_tagid_tag ON tags (tagid, tag); CREATE UNIQUE INDEX tags_tagid_tag_cd ON tags (tagid, tag, cd); -- -- Table: cd_to_producer -- CREATE TABLE cd_to_producer ( cd integer NOT NULL, producer integer NOT NULL, attribute integer, PRIMARY KEY (cd, producer) ); CREATE INDEX cd_to_producer_idx_cd ON cd_to_producer (cd); CREATE INDEX cd_to_producer_idx_producer ON cd_to_producer (producer); -- -- Table: images -- CREATE TABLE images ( id INTEGER PRIMARY KEY NOT NULL, artwork_id integer NOT NULL, name varchar(100) NOT NULL, data blob ); CREATE INDEX images_idx_artwork_id ON images (artwork_id); -- -- Table: twokeys -- CREATE TABLE twokeys ( artist integer NOT NULL, cd integer NOT NULL, PRIMARY KEY (artist, cd) ); CREATE INDEX twokeys_idx_artist ON twokeys (artist); -- -- Table: artwork_to_artist -- 
CREATE TABLE artwork_to_artist ( artwork_cd_id integer NOT NULL, artist_id integer NOT NULL, PRIMARY KEY (artwork_cd_id, artist_id) ); CREATE INDEX artwork_to_artist_idx_artist_id ON artwork_to_artist (artist_id); CREATE INDEX artwork_to_artist_idx_artwork_cd_id ON artwork_to_artist (artwork_cd_id); -- -- Table: fourkeys_to_twokeys -- CREATE TABLE fourkeys_to_twokeys ( f_foo integer NOT NULL, f_bar integer NOT NULL, f_hello integer NOT NULL, f_goodbye integer NOT NULL, t_artist integer NOT NULL, t_cd integer NOT NULL, autopilot character NOT NULL, pilot_sequence integer, PRIMARY KEY (f_foo, f_bar, f_hello, f_goodbye, t_artist, t_cd) ); CREATE INDEX fourkeys_to_twokeys_idx_f_foo_f_bar_f_hello_f_goodbye ON fourkeys_to_twokeys (f_foo, f_bar, f_hello, f_goodbye); CREATE INDEX fourkeys_to_twokeys_idx_t_artist_t_cd ON fourkeys_to_twokeys (t_artist, t_cd); -- -- View: year2000cds -- CREATE VIEW year2000cds AS SELECT cdid, artist, title, year, genreid, single_track FROM cd WHERE year = "2000"; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest.pm0000644000175000017500000003210713556035664024375 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest; use strict; use warnings; use DBICTest::RunMode; use DBICTest::Schema; use DBICTest::Util qw/populate_weakregistry assert_empty_weakregistry local_umask/; use Carp; use Path::Class::File (); use File::Spec; use Fcntl qw/:flock/; =head1 NAME DBICTest - Library to be used by DBIx::Class test scripts. =head1 SYNOPSIS use lib qw(t/lib); use DBICTest; use Test::More; my $schema = DBICTest->init_schema(); =head1 DESCRIPTION This module provides the basic utilities to write tests against DBIx::Class. =head1 METHODS =head2 init_schema my $schema = DBICTest->init_schema( no_deploy=>1, no_populate=>1, storage_type=>'::DBI::Replicated', storage_type_args=>{ balancer_type=>'DBIx::Class::Storage::DBI::Replicated::Balancer::Random' }, ); This method removes the test SQLite database in t/var/DBIxClass.db and then creates a new, empty database. This method will call deploy_schema() by default, unless the no_deploy flag is set. Also, by default, this method will call populate_schema() by default, unless the no_deploy or no_populate flags are set. =cut # some tests are very time sensitive and need to run on their own, without # being disturbed by anything else grabbing CPU or disk IO. 
Hence why everything # using DBICTest grabs a shared lock, and the few tests that request a :GlobalLock # will ask for an exclusive one and block until they can get it our ($global_lock_fh, $global_exclusive_lock); sub import { my $self = shift; my $lockpath = DBICTest::RunMode->tmpdir->file('.dbictest_global.lock'); { my $u = local_umask(0); # so that the file opens as 666, and any user can lock open ($global_lock_fh, '>', $lockpath) or die "Unable to open $lockpath: $!"; } for (@_) { if ($_ eq ':GlobalLock') { flock ($global_lock_fh, LOCK_EX) or die "Unable to lock $lockpath: $!"; $global_exclusive_lock = 1; } else { croak "Unknown export $_ requested from $self"; } } unless ($global_exclusive_lock) { flock ($global_lock_fh, LOCK_SH) or die "Unable to lock $lockpath: $!"; } } END { if ($global_lock_fh) { # delay destruction even more } } { my $dir = Path::Class::File->new(__FILE__)->dir->parent->subdir('var'); $dir->mkpath unless -d "$dir"; $dir = "$dir"; sub _sqlite_dbfilename { my $holder = $ENV{DBICTEST_LOCK_HOLDER} || $$; $holder = $$ if $holder == -1; # useful for missing cleanup debugging #if ( $holder == $$) { # my $x = $0; # $x =~ s/\//#/g; # $holder .= "-$x"; #} return "$dir/DBIxClass-$holder.db"; } END { _cleanup_dbfile(); } } $SIG{INT} = sub { _cleanup_dbfile(); exit 1 }; sub _cleanup_dbfile { # cleanup if this is us if ( ! $ENV{DBICTEST_LOCK_HOLDER} or $ENV{DBICTEST_LOCK_HOLDER} == -1 or $ENV{DBICTEST_LOCK_HOLDER} == $$ ) { my $db_file = _sqlite_dbfilename(); unlink $_ for ($db_file, "${db_file}-journal"); } } sub has_custom_dsn { return $ENV{"DBICTEST_DSN"} ? 1:0; } sub _sqlite_dbname { my $self = shift; my %args = @_; return $self->_sqlite_dbfilename if ( defined $args{sqlite_use_file} ? $args{sqlite_use_file} : $ENV{'DBICTEST_SQLITE_USE_FILE'} ); return ":memory:"; } sub _database { my $self = shift; my %args = @_; if ($ENV{DBICTEST_DSN}) { return ( (map { $ENV{"DBICTEST_${_}"} || '' } qw/DSN DBUSER DBPASS/), { AutoCommit => 1, %args }, ); } my $db_file = $self->_sqlite_dbname(%args); for ($db_file, "${db_file}-journal") { next unless -e $_; unlink ($_) or carp ( "Unable to unlink existing test database file $_ ($!), creation of fresh database / further tests may fail!" ); } return ("dbi:SQLite:${db_file}", '', '', { AutoCommit => 1, # this is executed on every connect, and thus installs a disconnect/DESTROY # guard for every new $dbh on_connect_do => sub { my $storage = shift; my $dbh = $storage->_get_dbh; # no fsync on commit $dbh->do ('PRAGMA synchronous = OFF'); # set a *DBI* disconnect callback, to make sure the physical SQLite # file is still there (i.e. 
the test does not attempt to delete # an open database, which fails on Win32) if (my $guard_cb = __mk_disconnect_guard($db_file)) { $dbh->{Callbacks} = { connect => sub { $guard_cb->('connect') }, disconnect => sub { $guard_cb->('disconnect') }, DESTROY => sub { $guard_cb->('DESTROY') }, }; } }, %args, }); } sub __mk_disconnect_guard { return if DBIx::Class::_ENV_::PEEPEENESS(); # leaks handles, delaying DESTROY, can't work right my $db_file = shift; return unless -f $db_file; my $orig_inode = (stat($db_file))[1] or return; my $clan_connect_caller = '*UNKNOWN*'; my $i; while ( my ($pack, $file, $line) = caller(++$i) ) { next if $file eq __FILE__; next if $pack =~ /^DBIx::Class|^Try::Tiny/; $clan_connect_caller = "$file line $line"; } my $failed_once = 0; my $connected = 1; return sub { return if $failed_once; my $event = shift; if ($event eq 'connect') { # this is necessary in case we are disconnected and connected again, all within the same $dbh object $connected = 1; return; } elsif ($event eq 'disconnect') { $connected = 0; } elsif ($event eq 'DESTROY' and ! $connected ) { return; } my $fail_reason; if (! -e $db_file) { $fail_reason = 'is missing'; } else { my $cur_inode = (stat($db_file))[1]; if ($orig_inode != $cur_inode) { # pack/unpack to match the unsigned longs returned by `stat` $fail_reason = sprintf 'was recreated (initially inode %s, now %s)', ( map { unpack ('L', pack ('l', $_) ) } ($orig_inode, $cur_inode ) ); } } if ($fail_reason) { $failed_once++; require Test::Builder; my $t = Test::Builder->new; local $Test::Builder::Level = $Test::Builder::Level + 3; $t->ok (0, "$db_file originally created at $clan_connect_caller $fail_reason before $event " . 'of DBI handle - a strong indicator that the database file was tampered with while ' . 'still being open. This action would fail massively if running under Win32, hence ' . 'we make sure it fails on any OS :)' ); } return; # this empty return is a DBI requirement }; } my $weak_registry = {}; sub init_schema { my $self = shift; my %args = @_; my $schema; if ($args{compose_connection}) { $schema = DBICTest::Schema->compose_connection( 'DBICTest', $self->_database(%args) ); } else { $schema = DBICTest::Schema->compose_namespace('DBICTest'); } if( $args{storage_type}) { $schema->storage_type($args{storage_type}); } if ( !$args{no_connect} ) { $schema = $schema->connect($self->_database(%args)); } if ( !$args{no_deploy} ) { __PACKAGE__->deploy_schema( $schema, $args{deploy_args} ); __PACKAGE__->populate_schema( $schema ) if( !$args{no_populate} ); } populate_weakregistry ( $weak_registry, $schema->storage ) if $INC{'Test/Builder.pm'} and $schema->storage; return $schema; } END { assert_empty_weakregistry($weak_registry, 'quiet'); } =head2 deploy_schema DBICTest->deploy_schema( $schema ); This method does one of two things to the schema. It can either call the experimental $schema->deploy() if the DBICTEST_SQLT_DEPLOY environment variable is set, otherwise the default is to read in the t/lib/sqlite.sql file and execute the SQL within. Either way you end up with a fresh set of tables for testing. =cut sub deploy_schema { my $self = shift; my $schema = shift; my $args = shift || {}; if ($ENV{"DBICTEST_SQLT_DEPLOY"}) { $schema->deploy($args); } else { my $filename = Path::Class::File->new(__FILE__)->dir ->file('sqlite.sql')->stringify; my $sql = do { local (@ARGV, $/) = $filename ; <> }; for my $chunk ( split (/;\s*\n+/, $sql) ) { if ( $chunk =~ / ^ (?! 
--\s* ) \S /xm ) { # there is some real sql in the chunk - a non-space at the start of the string which is not a comment $schema->storage->dbh_do(sub { $_[1]->do($chunk) }) or print "Error on SQL: $chunk\n"; } } } return; } =head2 populate_schema DBICTest->populate_schema( $schema ); After you deploy your schema you can use this method to populate the tables with test data. =cut sub populate_schema { my $self = shift; my $schema = shift; $schema->populate('Genre', [ [qw/genreid name/], [qw/1 emo /], ]); $schema->populate('Artist', [ [ qw/artistid name/ ], [ 1, 'Caterwauler McCrae' ], [ 2, 'Random Boy Band' ], [ 3, 'We Are Goth' ], ]); $schema->populate('CD', [ [ qw/cdid artist title year genreid/ ], [ 1, 1, "Spoonful of bees", 1999, 1 ], [ 2, 1, "Forkful of bees", 2001 ], [ 3, 1, "Caterwaulin' Blues", 1997 ], [ 4, 2, "Generic Manufactured Singles", 2001 ], [ 5, 3, "Come Be Depressed With Us", 1998 ], ]); $schema->populate('LinerNotes', [ [ qw/liner_id notes/ ], [ 2, "Buy Whiskey!" ], [ 4, "Buy Merch!" ], [ 5, "Kill Yourself!" ], ]); $schema->populate('Tag', [ [ qw/tagid cd tag/ ], [ 1, 1, "Blue" ], [ 2, 2, "Blue" ], [ 3, 3, "Blue" ], [ 4, 5, "Blue" ], [ 5, 2, "Cheesy" ], [ 6, 4, "Cheesy" ], [ 7, 5, "Cheesy" ], [ 8, 2, "Shiny" ], [ 9, 4, "Shiny" ], ]); $schema->populate('TwoKeys', [ [ qw/artist cd/ ], [ 1, 1 ], [ 1, 2 ], [ 2, 2 ], ]); $schema->populate('FourKeys', [ [ qw/foo bar hello goodbye sensors/ ], [ 1, 2, 3, 4, 'online' ], [ 5, 4, 3, 6, 'offline' ], ]); $schema->populate('OneKey', [ [ qw/id artist cd/ ], [ 1, 1, 1 ], [ 2, 1, 2 ], [ 3, 2, 2 ], ]); $schema->populate('SelfRef', [ [ qw/id name/ ], [ 1, 'First' ], [ 2, 'Second' ], ]); $schema->populate('SelfRefAlias', [ [ qw/self_ref alias/ ], [ 1, 2 ] ]); $schema->populate('ArtistUndirectedMap', [ [ qw/id1 id2/ ], [ 1, 2 ] ]); $schema->populate('Producer', [ [ qw/producerid name/ ], [ 1, 'Matt S Trout' ], [ 2, 'Bob The Builder' ], [ 3, 'Fred The Phenotype' ], ]); $schema->populate('CD_to_Producer', [ [ qw/cd producer/ ], [ 1, 1 ], [ 1, 2 ], [ 1, 3 ], ]); $schema->populate('TreeLike', [ [ qw/id parent name/ ], [ 1, undef, 'root' ], [ 2, 1, 'foo' ], [ 3, 2, 'bar' ], [ 6, 2, 'blop' ], [ 4, 3, 'baz' ], [ 5, 4, 'quux' ], [ 7, 3, 'fong' ], ]); $schema->populate('Track', [ [ qw/trackid cd position title/ ], [ 4, 2, 1, "Stung with Success"], [ 5, 2, 2, "Stripy"], [ 6, 2, 3, "Sticky Honey"], [ 7, 3, 1, "Yowlin"], [ 8, 3, 2, "Howlin"], [ 9, 3, 3, "Fowlin"], [ 10, 4, 1, "Boring Name"], [ 11, 4, 2, "Boring Song"], [ 12, 4, 3, "No More Ideas"], [ 13, 5, 1, "Sad"], [ 14, 5, 2, "Under The Weather"], [ 15, 5, 3, "Suicidal"], [ 16, 1, 1, "The Bees Knees"], [ 17, 1, 2, "Apiary"], [ 18, 1, 3, "Beehind You"], ]); $schema->populate('Event', [ [ qw/id starts_at created_on varchar_date varchar_datetime skip_inflation/ ], [ 1, '2006-04-25 22:24:33', '2006-06-22 21:00:05', '2006-07-23', '2006-05-22 19:05:07', '2006-04-21 18:04:06'], ]); $schema->populate('Link', [ [ qw/id url title/ ], [ 1, '', 'aaa' ] ]); $schema->populate('Bookmark', [ [ qw/id link/ ], [ 1, 1 ] ]); $schema->populate('Collection', [ [ qw/collectionid name/ ], [ 1, "Tools" ], [ 2, "Body Parts" ], ]); $schema->populate('TypedObject', [ [ qw/objectid type value/ ], [ 1, "pointy", "Awl" ], [ 2, "round", "Bearing" ], [ 3, "pointy", "Knife" ], [ 4, "pointy", "Tooth" ], [ 5, "round", "Head" ], ]); $schema->populate('CollectionObject', [ [ qw/collection object/ ], [ 1, 1 ], [ 1, 2 ], [ 1, 3 ], [ 2, 4 ], [ 2, 5 ], ]); $schema->populate('Owners', [ [ qw/id name/ ], [ 1, "Newton" ], [ 2, "Waltham" ], ]); 
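# Note: every populate() call in this sub uses DBIx::Class's bulk arrayref form - the first arrayref names the columns, each following arrayref is one data row.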
$schema->populate('BooksInLibrary', [ [ qw/id owner title source price/ ], [ 1, 1, "Programming Perl", "Library", 23 ], [ 2, 1, "Dynamical Systems", "Library", 37 ], [ 3, 2, "Best Recipe Cookbook", "Library", 65 ], ]); } 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/0000775000175000017500000000000013556035664024036 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Util.pm0000644000175000017500000000664413556035664025321 0ustar ahartmaiahartmaipackage DBICTest::Util; use warnings; use strict; use Carp; use Scalar::Util qw/isweak weaken blessed reftype refaddr/; use Config; use base 'Exporter'; our @EXPORT_OK = qw/local_umask stacktrace populate_weakregistry assert_empty_weakregistry/; sub local_umask { return unless defined $Config{d_umask}; die 'Calling local_umask() in void context makes no sense' if ! defined wantarray; my $old_umask = umask(shift()); die "Setting umask failed: $!" unless defined $old_umask; return bless \$old_umask, 'DBICTest::Util::UmaskGuard'; } { package DBICTest::Util::UmaskGuard; sub DESTROY { local ($@, $!); eval { defined (umask ${$_[0]}) or die }; warn ( "Unable to reset old umask ${$_[0]}: " . ($!||'Unknown error') ) if ($@ || $!); } } sub stacktrace { my $frame = shift; $frame++; my (@stack, @frame); while (@frame = caller($frame++)) { push @stack, [@frame[3,1,2]]; } return undef unless @stack; $stack[0][0] = ''; return join "\tinvoked as ", map { sprintf ("%s at %s line %d\n", @$_ ) } @stack; } my $refs_traced = 0; sub populate_weakregistry { my ($reg, $target, $slot) = @_; croak 'Target is not a reference' unless defined ref $target; $slot ||= (sprintf '%s%s(0x%x)', # so we don't trigger stringification (defined blessed $target) ? blessed($target) . '=' : '', reftype $target, refaddr $target, ); if (defined $reg->{$slot}{weakref}) { if ( refaddr($reg->{$slot}{weakref}) != (refaddr $target) ) { print STDERR "Bail out! Weak Registry slot collision: $reg->{$slot}{weakref} / $target\n"; exit 255; } } else { $refs_traced++; weaken( $reg->{$slot}{weakref} = $target ); $reg->{$slot}{stacktrace} = stacktrace(1); } $target; } my $leaks_found; sub assert_empty_weakregistry { my ($weak_registry, $quiet) = @_; croak 'Expecting a registry hashref' unless ref $weak_registry eq 'HASH'; return unless keys %$weak_registry; my $tb = eval { Test::Builder->new } or croak 'Calling test_weakregistry without a loaded Test::Builder makes no sense'; for my $slot (sort keys %$weak_registry) { next if ! defined $weak_registry->{$slot}{weakref}; $tb->BAILOUT("!!!! WEAK REGISTRY SLOT $slot IS NOT A WEAKREF !!!!") unless isweak( $weak_registry->{$slot}{weakref} ); } for my $slot (sort keys %$weak_registry) { ! defined $weak_registry->{$slot}{weakref} and next if $quiet; $tb->ok (! defined $weak_registry->{$slot}{weakref}, "No leaks of $slot") or do { $leaks_found = 1; my $diag = ''; $diag .= Devel::FindRef::track ($weak_registry->{$slot}{weakref}, 20) . "\n" if ( $ENV{TEST_VERBOSE} && eval { require Devel::FindRef }); if (my $stack = $weak_registry->{$slot}{stacktrace}) { $diag .= " Reference first seen$stack"; } $tb->diag($diag) if $diag; }; } } END { if ($INC{'Test/Builder.pm'}) { my $tb = Test::Builder->new; # we check for test passage - a leak may be a part of a TODO if ($leaks_found and !$tb->is_passing) { $tb->diag(sprintf "\n\n%s\n%s\n\nInstall Devel::FindRef and re-run the test with set " . '$ENV{TEST_VERBOSE} (prove -v) to see a more detailed leak-report' . 
"\n\n%s\n%s\n\n", ('#' x 16) x 4 ) if ( !$ENV{TEST_VERBOSE} or !$INC{'Devel/FindRef.pm'} ); } else { $tb->note("Auto checked $refs_traced references for leaks - none detected"); } } } 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Stats.pm0000644000175000017500000000167313556035664025477 0ustar ahartmaiahartmaipackage DBICTest::Stats; use strict; use warnings; use base qw/DBIx::Class::Storage::Statistics/; sub txn_begin { my $self = shift; $self->{'TXN_BEGIN'}++; return $self->{'TXN_BEGIN'}; } sub txn_rollback { my $self = shift; $self->{'TXN_ROLLBACK'}++; return $self->{'TXN_ROLLBACK'}; } sub txn_commit { my $self = shift; $self->{'TXN_COMMIT'}++; return $self->{'TXN_COMMIT'}; } sub svp_begin { my ($self, $name) = @_; $self->{'SVP_BEGIN'}++; return $self->{'SVP_BEGIN'}; } sub svp_release { my ($self, $name) = @_; $self->{'SVP_RELEASE'}++; return $self->{'SVP_RELEASE'}; } sub svp_rollback { my ($self, $name) = @_; $self->{'SVP_ROLLBACK'}++; return $self->{'SVP_ROLLBACK'}; } sub query_start { my ($self, $string, @bind) = @_; $self->{'QUERY_START'}++; return $self->{'QUERY_START'}; } sub query_end { my ($self, $string) = @_; $self->{'QUERY_END'}++; return $self->{'QUERY_START'}; } 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Cursor.pm0000644000175000017500000000015113556035664025644 0ustar ahartmaiahartmaipackage DBICTest::Cursor; use strict; use warnings; use base qw/DBIx::Class::Storage::DBI::Cursor/; 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema.pm0000644000175000017500000001425013556035664025574 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema; use strict; use warnings; no warnings 'qw'; use base 'DBICTest::BaseSchema'; use Fcntl qw/:DEFAULT :seek :flock/; use Time::HiRes 'sleep'; use DBICTest::RunMode; use DBICTest::Util qw/populate_weakregistry assert_empty_weakregistry local_umask/; use namespace::clean; __PACKAGE__->mk_group_accessors(simple => 'custom_attr'); __PACKAGE__->load_classes(qw/ Artist SequenceTest BindType Employee CD Genre Bookmark Link #dummy Track Tag Year2000CDs Year1999CDs CustomSql Money TimestampPrimaryKey /, { 'DBICTest::Schema' => [qw/ LinerNotes Artwork Artwork_to_Artist Image Lyrics LyricVersion OneKey #dummy TwoKeys Serialized /]}, ( 'FourKeys', 'FourKeys_to_TwoKeys', '#dummy', 'SelfRef', 'ArtistUndirectedMap', 'ArtistSourceName', 'ArtistSubclass', 'Producer', 'CD_to_Producer', 'Dummy', # this is a real result class we remove in the hook below ), qw/SelfRefAlias TreeLike TwoKeyTreeLike Event EventTZ NoPrimaryKey/, qw/Collection CollectionObject TypedObject Owners BooksInLibrary/, qw/ForceForeign Encoded/, ); sub sqlt_deploy_hook { my ($self, $sqlt_schema) = @_; $sqlt_schema->drop_table('dummy'); } our $locker; END { # we need the $locker to be referenced here for delayed destruction if ($locker->{lock_name} and ($ENV{DBICTEST_LOCK_HOLDER}||0) == $$) { #warn "$$ $0 $locktype LOCK RELEASED"; } } my $weak_registry = {}; sub connection { my $self = shift->next::method(@_); # MASSIVE FIXME # we can't really lock based on DSN, as we do not yet have a way to tell that e.g. # DBICTEST_MSSQL_DSN=dbi:Sybase:server=192.168.0.11:1433;database=dbtst # and # DBICTEST_MSSQL_ODBC_DSN=dbi:ODBC:server=192.168.0.11;port=1433;database=dbtst;driver=FreeTDS;tds_version=8.0 # are the same server # hence we lock everything based on sqlt_type or just globally if not available # just pretend we are python you know? 
:) # when we get a proper DSN resolution sanitize to produce a portable lockfile name # this may look weird and unnecessary, but consider running tests from # windows over a samba share >.> #utf8::encode($dsn); #$dsn =~ s/([^A-Za-z0-9_\-\.\=])/ sprintf '~%02X', ord($1) /ge; #$dsn =~ s/^dbi/dbi/i; # provide locking for physical (non-memory) DSNs, so that tests can # safely run in parallel. While the harness (make -jN test) does set # an envvar, we can not detect when a user invokes prove -jN. Hence # perform the locking at all times, it shouldn't hurt. # the lock fh *should* inherit across forks/subprocesses # # File locking is hard. Really hard. By far the best lock implementation # I've seen is part of the guts of File::Temp. However it is sadly not # reusable. Since I am not aware of folks doing NFS parallel testing, # nor are we known to work on VMS, I am just going to punt this and # use the portable-ish flock() provided by perl itself. If this does # not work for you - patches more than welcome. if ( ! $DBICTest::global_exclusive_lock and ( ! $ENV{DBICTEST_LOCK_HOLDER} or $ENV{DBICTEST_LOCK_HOLDER} == $$ ) and ref($_[0]) ne 'CODE' and ($_[0]||'') !~ /^ (?i:dbi) \: SQLite \: (?: dbname\= )? (?: \:memory\: | t [\/\\] var [\/\\] DBIxClass\-) /x ) { my $locktype = do { # guard against infinite recursion local $ENV{DBICTEST_LOCK_HOLDER} = -1; # we need to connect a forced fresh clone so that we do not upset any state # of the main $schema (some tests examine it quite closely) local $@; my $storage = eval { my $st = ref($self)->connect(@{$self->storage->connect_info})->storage; $st->ensure_connected; # do connect here, to catch a possible throw $st; }; $storage ? do { my $t = $storage->sqlt_type || 'generic'; eval { $storage->disconnect }; $t; } : undef ; }; # Never hold more than one lock. This solves the "lock in order" issues # unrelated tests may have # Also if there is no connection - there is no lock to be had if ($locktype and (!$locker or $locker->{type} ne $locktype)) { warn "$$ $0 $locktype" if ( ($locktype eq 'generic' or $locktype eq 'SQLite') and DBICTest::RunMode->is_author ); my $lockpath = DBICTest::RunMode->tmpdir->file(".dbictest_$locktype.lock"); my $lock_fh; { my $u = local_umask(0); # so that the file opens as 666, and any user can lock sysopen ($lock_fh, $lockpath, O_RDWR|O_CREAT) or die "Unable to open $lockpath: $!"; } flock ($lock_fh, LOCK_EX) or die "Unable to lock $lockpath: $!"; #warn "$$ $0 $locktype LOCK GRABBED"; # see if anyone was holding a lock before us, and wait up to 5 seconds for them to terminate # if we do not do this we may end up trampling over some long-running END or somesuch seek ($lock_fh, 0, SEEK_SET) or die "seek failed $!"; my $old_pid; if ( read ($lock_fh, $old_pid, 100) and ($old_pid) = $old_pid =~ /^(\d+)$/ ) { for (1..50) { kill (0, $old_pid) or last; sleep 0.1; } } #warn "$$ $0 $locktype POST GRAB WAIT"; truncate $lock_fh, 0; seek ($lock_fh, 0, SEEK_SET) or die "seek failed $!"; $lock_fh->autoflush(1); print $lock_fh $$; $ENV{DBICTEST_LOCK_HOLDER} ||= $$; $locker = { type => $locktype, fh => $lock_fh, lock_name => "$lockpath", }; } } if ($INC{'Test/Builder.pm'}) { populate_weakregistry ( $weak_registry, $self->storage ); my $cur_connect_call = $self->storage->on_connect_call; $self->storage->on_connect_call([ (ref $cur_connect_call eq 'ARRAY' ? 
@$cur_connect_call : ($cur_connect_call || ()) ), [sub { populate_weakregistry( $weak_registry, shift->_dbh ) }], ]); } return $self; } sub clone { my $self = shift->next::method(@_); populate_weakregistry ( $weak_registry, $self ) if $INC{'Test/Builder.pm'}; $self; } END { assert_empty_weakregistry($weak_registry, 'quiet'); } 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/RunMode.pm0000644000175000017500000001225413556035664025747 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::RunMode; use strict; use warnings; BEGIN { if ($INC{'DBIx/Class.pm'}) { my ($fr, @frame) = 1; while (@frame = caller($fr++)) { last if $frame[1] !~ m|^t/lib/DBICTest|; } die __PACKAGE__ . " must be loaded before DBIx::Class (or modules using DBIx::Class) at $frame[1] line $frame[2]\n"; } } use Path::Class qw/file dir/; use File::Spec; _check_author_makefile() unless $ENV{DBICTEST_NO_MAKEFILE_VERIFICATION}; # PathTools has a bug where on MSWin32 it will often return / as a tmpdir. # This is *really* stupid and the result of having our lockfiles all over # the place is also rather obnoxious. So we use our own heuristics instead # https://rt.cpan.org/Ticket/Display.html?id=76663 my $tmpdir; sub tmpdir { dir ($tmpdir ||= do { my $dir = dir(File::Spec->tmpdir); my @parts = File::Spec->splitdir($dir); if (@parts == 2 and $parts[1] eq '') { # This means we were give the root dir (C:\ or something equally unacceptable) # Replace with our local project tmpdir. This will make multiple runs # from different runs conflict with each other, but is much better than # polluting the root dir with random crap $dir = _find_co_root()->subdir('t')->subdir('var'); $dir->mkpath; } $dir->stringify; }); } # Die if the author did not update his makefile # # This is pretty heavy handed, so the check is pretty solid: # # 1) Assume that this particular module is loaded from -I <$root>/t/lib # 2) Make sure <$root>/Makefile.PL exists # 3) Make sure we can stat() <$root>/Makefile.PL # # If all of the above is satisfied # # *) die if <$root>/inc does not exist # *) die if no stat() results for <$root>/Makefile (covers no Makefile) # *) die if Makefile.PL mtime > Makefile mtime # sub _check_author_makefile { my $root = _find_co_root() or return; my $optdeps = file('lib/DBIx/Class/Optional/Dependencies.pm'); # not using file->stat as it invokes File::stat which in turn breaks stat(_) my ($mf_pl_mtime, $mf_mtime, $optdeps_mtime) = ( map { (stat ($root->file ($_)) )[9] || undef } # stat returns () on nonexistent files (qw|Makefile.PL Makefile|, $optdeps) ); return unless $mf_pl_mtime; # something went wrong during co_root detection ? my @fail_reasons; if(not -d $root->subdir ('inc')) { push @fail_reasons, "Missing ./inc directory"; } if(not $mf_mtime) { push @fail_reasons, "Missing ./Makefile"; } else { if($mf_mtime < $mf_pl_mtime) { push @fail_reasons, "./Makefile.PL is newer than ./Makefile"; } if($mf_mtime < $optdeps_mtime) { push @fail_reasons, "./$optdeps is newer than ./Makefile"; } } if (@fail_reasons) { print STDERR <<'EOE'; !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! ======================== FATAL ERROR =========================== !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! We have a number of reasons to believe that this is a development checkout and that you, the user, did not run `perl Makefile.PL` before using this code. You absolutely _must_ perform this step, to ensure you have all required dependencies present. 
Not doing so often results in a lot of wasted time for other contributors trying to assist you with spurious "it's broken!" problems. By default DBIC's Makefile.PL turns all optional dependencies into *HARD REQUIREMENTS*, in order to make sure that the entire test suite is executed, and no tests are skipped due to missing modules. If you for some reason need to disable this behavior - supply the --skip_author_deps option when running perl Makefile.PL If you are seeing this message unexpectedly (i.e. you are in fact attempting a regular installation be it through CPAN or manually), please report the situation to either the mailing list or to the irc channel as described in http://search.cpan.org/dist/DBIx-Class/lib/DBIx/Class.pm#GETTING_HELP/SUPPORT The DBIC team Reasons you received this message: EOE foreach my $r (@fail_reasons) { print STDERR " * $r\n"; } print STDERR "\n\n\n"; exit 1; } } # Mimic $Module::Install::AUTHOR sub is_author { my $root = _find_co_root() or return undef; return ( ( not -d $root->subdir ('inc') ) or ( -e $root->subdir ('inc')->subdir ($^O eq 'VMS' ? '_author' : '.author') ) ); } sub is_smoker { return ( $ENV{AUTOMATED_TESTING} && ! $ENV{PERL5_CPANM_IS_RUNNING} && ! $ENV{RELEASE_TESTING} ) } sub is_plain { return (! __PACKAGE__->is_smoker && ! __PACKAGE__->is_author && ! $ENV{RELEASE_TESTING} ) } # Try to determine the root of a checkout/untar if possible # or return undef sub _find_co_root { my @mod_parts = split /::/, (__PACKAGE__ . '.pm'); my $rel_path = join ('/', @mod_parts); # %INC stores paths with / regardless of OS return undef unless ($INC{$rel_path}); # a bit convoluted, but what we do here essentially is: # - get the file name of this particular module # - do 'cd ..' as many times as necessary to get to t/lib/../.. my $root = dir ($INC{$rel_path}); for (1 .. @mod_parts + 2) { $root = $root->parent; } return (-f $root->file ('Makefile.PL') ) ?
$root : undef ; } 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/0000775000175000017500000000000013556035664025236 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/CD.pm0000644000175000017500000000750613556035664026070 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::CD; use base qw/DBICTest::BaseResult/; # this tests table name as scalar ref # DO NOT REMOVE THE \ __PACKAGE__->table(\'cd'); __PACKAGE__->add_columns( 'cdid' => { data_type => 'integer', is_auto_increment => 1, }, 'artist' => { data_type => 'integer', }, 'title' => { data_type => 'varchar', size => 100, }, 'year' => { data_type => 'varchar', size => 100, }, 'genreid' => { data_type => 'integer', is_nullable => 1, accessor => undef, }, 'single_track' => { data_type => 'integer', is_nullable => 1, is_foreign_key => 1, } ); __PACKAGE__->set_primary_key('cdid'); __PACKAGE__->add_unique_constraint([ qw/artist title/ ]); __PACKAGE__->belongs_to( artist => 'DBICTest::Schema::Artist', undef, { is_deferrable => 1, proxy => { artist_name => 'name' }, }); __PACKAGE__->belongs_to( very_long_artist_relationship => 'DBICTest::Schema::Artist', 'artist', { is_deferrable => 1, }); # in case this is a single-cd it promotes a track from another cd __PACKAGE__->belongs_to( single_track => 'DBICTest::Schema::Track', 'single_track', { join_type => 'left'} ); # add a non-left single relationship for the complex prefetch tests __PACKAGE__->belongs_to( existing_single_track => 'DBICTest::Schema::Track', 'single_track'); __PACKAGE__->has_many( tracks => 'DBICTest::Schema::Track' ); __PACKAGE__->has_many( tags => 'DBICTest::Schema::Tag', undef, { order_by => 'tag' }, ); __PACKAGE__->has_many( cd_to_producer => 'DBICTest::Schema::CD_to_Producer' => 'cd' ); __PACKAGE__->might_have( liner_notes => 'DBICTest::Schema::LinerNotes', undef, { proxy => [ qw/notes/ ] }, ); __PACKAGE__->might_have(artwork => 'DBICTest::Schema::Artwork', 'cd_id'); __PACKAGE__->has_one(mandatory_artwork => 'DBICTest::Schema::Artwork', 'cd_id'); __PACKAGE__->many_to_many( producers => cd_to_producer => 'producer' ); __PACKAGE__->many_to_many( producers_sorted => cd_to_producer => 'producer', { order_by => 'producer.name' }, ); __PACKAGE__->belongs_to('genre', 'DBICTest::Schema::Genre', { 'foreign.genreid' => 'self.genreid' }, { join_type => 'left', on_delete => 'SET NULL', on_update => 'CASCADE', }, ); #This second relationship was added to test the short-circuiting of pointless #queries provided by undef_on_null_fk. the relevant test in 66relationship.t __PACKAGE__->belongs_to('genre_inefficient', 'DBICTest::Schema::Genre', { 'foreign.genreid' => 'self.genreid' }, { join_type => 'left', on_delete => 'SET NULL', on_update => 'CASCADE', undef_on_null_fk => 0, }, ); # This is insane. Don't ever do anything like that # This is for testing purposes only! # mst: mo: DBIC is an "object relational mapper" # mst: mo: not an "object relational hider-because-mo-doesn't-understand-databases # ribasushi: mo: try it with a subselect nevertheless, I'd love to be proven wrong # ribasushi: mo: does sqlite actually take this? # ribasushi: an order in a correlated subquery is insane - how long does it take you on real data? 
__PACKAGE__->might_have( 'last_track', 'DBICTest::Schema::Track', sub { my $args = shift; return ( { "$args->{foreign_alias}.trackid" => { '=' => $args->{self_resultsource}->schema->resultset('Track')->search( { 'correlated_tracks.cd' => { -ident => "$args->{self_alias}.cdid" } }, { order_by => { -desc => 'position' }, rows => 1, alias => 'correlated_tracks', columns => ['trackid'] }, )->as_query } } ); }, ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/Tag.pm0000644000175000017500000000137313556035664026311 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Tag; use base qw/DBICTest::BaseResult/; __PACKAGE__->table('tags'); __PACKAGE__->add_columns( 'tagid' => { data_type => 'integer', is_auto_increment => 1, }, 'cd' => { data_type => 'integer', }, 'tag' => { data_type => 'varchar', size => 100, }, ); __PACKAGE__->set_primary_key('tagid'); __PACKAGE__->add_unique_constraints( # do not remove, part of a test tagid_cd => [qw/ tagid cd /], tagid_cd_tag => [qw/ tagid cd tag /], ); __PACKAGE__->add_unique_constraints( # do not remove, part of a test [qw/ tagid tag /], [qw/ tagid tag cd /], ); __PACKAGE__->belongs_to( cd => 'DBICTest::Schema::CD', 'cd', { proxy => [ 'year', { cd_title => 'title' } ], }); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/Link.pm0000644000175000017500000000124013556035664026464 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Link; use base qw/DBICTest::BaseResult/; use strict; use warnings; __PACKAGE__->table('link'); __PACKAGE__->add_columns( 'id' => { data_type => 'integer', is_auto_increment => 1 }, 'url' => { data_type => 'varchar', size => 100, is_nullable => 1, }, 'title' => { data_type => 'varchar', size => 100, is_nullable => 1, }, ); __PACKAGE__->set_primary_key('id'); __PACKAGE__->has_many ( bookmarks => 'DBICTest::Schema::Bookmark', 'link', { cascade_delete => 0 } ); use overload '""' => sub { shift->url }, fallback=> 1; 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/Money.pm0000644000175000017500000000052313556035664026661 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Money; use base qw/DBICTest::BaseResult/; __PACKAGE__->table('money_test'); __PACKAGE__->add_columns( 'id' => { data_type => 'integer', is_auto_increment => 1, }, 'amount' => { data_type => 'money', is_nullable => 1, }, ); __PACKAGE__->set_primary_key('id'); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/Dummy.pm0000644000175000017500000000073413556035664026671 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Dummy; use base qw/DBICTest::BaseResult/; use strict; use warnings; __PACKAGE__->table('dummy'); __PACKAGE__->add_columns( 'id' => { data_type => 'integer', is_auto_increment => 1 }, 'gittery' => { data_type => 'varchar', size => 100, is_nullable => 1, }, ); __PACKAGE__->set_primary_key('id'); # part of a test, do not remove __PACKAGE__->sequence('bogus'); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/Track.pm0000644000175000017500000000524213556035664026641 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Track; use base qw/DBICTest::BaseResult/; use Carp qw/confess/; __PACKAGE__->load_components(qw{ +DBICTest::DeployComponent InflateColumn::DateTime Ordered }); __PACKAGE__->table('track'); __PACKAGE__->add_columns( 'trackid' => { data_type => 'integer', is_auto_increment => 1, }, 'cd' => { data_type => 'integer', }, 'position' => { data_type => 'int', accessor => 
'pos', }, 'title' => { data_type => 'varchar', size => 100, }, last_updated_on => { data_type => 'datetime', accessor => 'updated_date', is_nullable => 1 }, last_updated_at => { data_type => 'datetime', is_nullable => 1 }, ); __PACKAGE__->set_primary_key('trackid'); __PACKAGE__->add_unique_constraint([ qw/cd position/ ]); __PACKAGE__->add_unique_constraint([ qw/cd title/ ]); __PACKAGE__->position_column ('position'); __PACKAGE__->grouping_column ('cd'); __PACKAGE__->belongs_to( cd => 'DBICTest::Schema::CD', undef, { proxy => { cd_title => 'title' }, }); __PACKAGE__->belongs_to( disc => 'DBICTest::Schema::CD' => 'cd', { proxy => 'year' }); __PACKAGE__->might_have( cd_single => 'DBICTest::Schema::CD', 'single_track' ); __PACKAGE__->might_have( lyrics => 'DBICTest::Schema::Lyrics', 'track_id' ); __PACKAGE__->belongs_to( "year1999cd", "DBICTest::Schema::Year1999CDs", { "foreign.cdid" => "self.cd" }, { join_type => 'left' }, # the relationship is of course optional ); __PACKAGE__->belongs_to( "year2000cd", "DBICTest::Schema::Year2000CDs", { "foreign.cdid" => "self.cd" }, { join_type => 'left' }, ); __PACKAGE__->has_many ( next_tracks => __PACKAGE__, sub { my $args = shift; # This is for test purposes only. A regular user does not # need to sanity check the passed-in arguments, this is what # the tests are for :) my @missing_args = grep { ! defined $args->{$_} } qw/self_alias foreign_alias self_resultsource foreign_relname/; confess "Required arguments not supplied to custom rel coderef: @missing_args\n" if @missing_args; return ( { "$args->{foreign_alias}.cd" => { -ident => "$args->{self_alias}.cd" }, "$args->{foreign_alias}.position" => { '>' => { -ident => "$args->{self_alias}.position" } }, }, $args->{self_rowobj} && { "$args->{foreign_alias}.cd" => $args->{self_rowobj}->get_column('cd'), "$args->{foreign_alias}.position" => { '>' => $args->{self_rowobj}->pos }, } ) } ); our $hook_cb; sub sqlt_deploy_hook { my $class = shift; $hook_cb->($class, @_) if $hook_cb; $class->next::method(@_) if $class->next::can; } 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/Event.pm0000644000175000017500000000200713556035664026652 0ustar ahartmaiahartmaipackage DBICTest::Schema::Event; use strict; use warnings; use base qw/DBICTest::BaseResult/; __PACKAGE__->load_components(qw/InflateColumn::DateTime/); __PACKAGE__->table('event'); __PACKAGE__->add_columns( id => { data_type => 'integer', is_auto_increment => 1 }, # this MUST be 'date' for the Firebird and SQLAnywhere tests starts_at => { data_type => 'date', datetime_undef_if_invalid => 1 }, created_on => { data_type => 'timestamp' }, varchar_date => { data_type => 'varchar', size => 20, is_nullable => 1 }, varchar_datetime => { data_type => 'varchar', size => 20, is_nullable => 1 }, skip_inflation => { data_type => 'datetime', inflate_datetime => 0, is_nullable => 1 }, ts_without_tz => { data_type => 'datetime', is_nullable => 1 }, # used in EventTZPg ); __PACKAGE__->set_primary_key('id'); # Test add_columns '+colname' to augment a column definition. 
__PACKAGE__->add_columns( '+varchar_date' => { inflate_date => 1, }, '+varchar_datetime' => { inflate_datetime => 1, }, ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/Genre.pm0000644000175000017500000000103713556035664026633 0ustar ahartmaiahartmaipackage DBICTest::Schema::Genre; use strict; use base qw/DBICTest::BaseResult/; __PACKAGE__->table('genre'); __PACKAGE__->add_columns( genreid => { data_type => 'integer', is_auto_increment => 1, }, name => { data_type => 'varchar', size => 100, }, ); __PACKAGE__->set_primary_key('genreid'); __PACKAGE__->add_unique_constraint ( genre_name => [qw/name/] ); __PACKAGE__->has_many (cds => 'DBICTest::Schema::CD', 'genreid'); __PACKAGE__->has_one (model_cd => 'DBICTest::Schema::CD', 'genreid'); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/Image.pm0000644000175000017500000000104713556035664026616 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Image; use base qw/DBICTest::BaseResult/; __PACKAGE__->table('images'); __PACKAGE__->add_columns( 'id' => { data_type => 'integer', is_auto_increment => 1, }, 'artwork_id' => { data_type => 'integer', is_foreign_key => 1, }, 'name' => { data_type => 'varchar', size => 100, }, 'data' => { data_type => 'blob', is_nullable => 1, }, ); __PACKAGE__->set_primary_key('id'); __PACKAGE__->belongs_to('artwork', 'DBICTest::Schema::Artwork', 'artwork_id'); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/Lyrics.pm0000644000175000017500000000100113556035664027027 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Lyrics; use base qw/DBICTest::BaseResult/; __PACKAGE__->table('lyrics'); __PACKAGE__->add_columns( 'lyric_id' => { data_type => 'integer', is_auto_increment => 1, }, 'track_id' => { data_type => 'integer', is_foreign_key => 1, }, ); __PACKAGE__->set_primary_key('lyric_id'); __PACKAGE__->belongs_to('track', 'DBICTest::Schema::Track', 'track_id'); __PACKAGE__->has_many('lyric_versions', 'DBICTest::Schema::LyricVersion', 'lyric_id'); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/Owners.pm0000644000175000017500000000071513556035664027052 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Owners; use base qw/DBICTest::BaseResult/; __PACKAGE__->table('owners'); __PACKAGE__->add_columns( 'id' => { data_type => 'integer', is_auto_increment => 1, }, 'name' => { data_type => 'varchar', size => '100', }, ); __PACKAGE__->set_primary_key('id'); __PACKAGE__->add_unique_constraint(['name']); __PACKAGE__->has_many(books => "DBICTest::Schema::BooksInLibrary", "owner"); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/Artist.pm0000644000175000017500000001111013556035664027032 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Artist; use base qw/DBICTest::BaseResult/; use Carp qw/confess/; __PACKAGE__->table('artist'); __PACKAGE__->source_info({ "source_info_key_A" => "source_info_value_A", "source_info_key_B" => "source_info_value_B", "source_info_key_C" => "source_info_value_C", }); __PACKAGE__->add_columns( 'artistid' => { data_type => 'integer', is_auto_increment => 1, }, 'name' => { data_type => 'varchar', size => 100, is_nullable => 1, }, rank => { data_type => 'integer', default_value => 13, }, charfield => { data_type => 'char', size => 10, is_nullable => 1, }, ); __PACKAGE__->set_primary_key('artistid'); __PACKAGE__->add_unique_constraint(['name']); __PACKAGE__->add_unique_constraint(artist => ['artistid']); # do not remove, part of a 
test __PACKAGE__->add_unique_constraint(u_nullable => [qw/charfield rank/]); __PACKAGE__->mk_classdata('field_name_for', { artistid => 'primary key', name => 'artist name', }); __PACKAGE__->has_many( cds => 'DBICTest::Schema::CD', undef, { order_by => { -asc => 'year'} }, ); __PACKAGE__->has_many( cds_80s => 'DBICTest::Schema::CD', sub { my $args = shift; # This is for test purposes only. A regular user does not # need to sanity check the passed-in arguments, this is what # the tests are for :) my @missing_args = grep { ! defined $args->{$_} } qw/self_alias foreign_alias self_resultsource foreign_relname/; confess "Required arguments not supplied to custom rel coderef: @missing_args\n" if @missing_args; return ( { "$args->{foreign_alias}.artist" => { '=' => { -ident => "$args->{self_alias}.artistid"} }, "$args->{foreign_alias}.year" => { '>' => 1979, '<' => 1990 }, }, $args->{self_rowobj} && { "$args->{foreign_alias}.artist" => $args->{self_rowobj}->artistid, "$args->{foreign_alias}.year" => { '>' => 1979, '<' => 1990 }, } ); }, ); __PACKAGE__->has_many( cds_84 => 'DBICTest::Schema::CD', sub { my $args = shift; # This is for test purposes only. A regular user does not # need to sanity check the passed-in arguments, this is what # the tests are for :) my @missing_args = grep { ! defined $args->{$_} } qw/self_alias foreign_alias self_resultsource foreign_relname/; confess "Required arguments not supplied to custom rel coderef: @missing_args\n" if @missing_args; return ( { "$args->{foreign_alias}.artist" => { -ident => "$args->{self_alias}.artistid" }, "$args->{foreign_alias}.year" => 1984, }, $args->{self_rowobj} && { "$args->{foreign_alias}.artist" => $args->{self_rowobj}->artistid, "$args->{foreign_alias}.year" => 1984, } ); } ); __PACKAGE__->has_many( cds_90s => 'DBICTest::Schema::CD', sub { my $args = shift; # This is for test purposes only. A regular user does not # need to sanity check the passed-in arguments, this is what # the tests are for :) my @missing_args = grep { ! 
defined $args->{$_} } qw/self_alias foreign_alias self_resultsource foreign_relname/; confess "Required arguments not supplied to custom rel coderef: @missing_args\n" if @missing_args; return ( { "$args->{foreign_alias}.artist" => { -ident => "$args->{self_alias}.artistid" }, "$args->{foreign_alias}.year" => { '>' => 1989, '<' => 2000 }, } ); } ); __PACKAGE__->has_many( cds_unordered => 'DBICTest::Schema::CD' ); __PACKAGE__->has_many( cds_very_very_very_long_relationship_name => 'DBICTest::Schema::CD' ); __PACKAGE__->has_many( twokeys => 'DBICTest::Schema::TwoKeys' ); __PACKAGE__->has_many( onekeys => 'DBICTest::Schema::OneKey' ); __PACKAGE__->has_many( artist_undirected_maps => 'DBICTest::Schema::ArtistUndirectedMap', [ {'foreign.id1' => 'self.artistid'}, {'foreign.id2' => 'self.artistid'} ], { cascade_copy => 0 } # this would *so* not make sense ); __PACKAGE__->has_many( artwork_to_artist => 'DBICTest::Schema::Artwork_to_Artist' => 'artist_id' ); __PACKAGE__->many_to_many('artworks', 'artwork_to_artist', 'artwork'); sub sqlt_deploy_hook { my ($self, $sqlt_table) = @_; if ($sqlt_table->schema->translator->producer_type =~ /SQLite$/ ) { $sqlt_table->add_index( name => 'artist_name_hookidx', fields => ['name'] ) or die $sqlt_table->error; } } sub store_column { my ($self, $name, $value) = @_; $value = 'X '.$value if ($name eq 'name' && $value && $value =~ /(X )?store_column test/); $self->next::method($name, $value); } 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/OneKey.pm0000644000175000017500000000055013556035664026764 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::OneKey; use base qw/DBICTest::BaseResult/; __PACKAGE__->table('onekey'); __PACKAGE__->add_columns( 'id' => { data_type => 'integer', is_auto_increment => 1, }, 'artist' => { data_type => 'integer', }, 'cd' => { data_type => 'integer', }, ); __PACKAGE__->set_primary_key('id'); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/EventTZ.pm0000644000175000017500000000117513556035664027135 0ustar ahartmaiahartmaipackage DBICTest::Schema::EventTZ; use strict; use warnings; use base qw/DBICTest::BaseResult/; __PACKAGE__->load_components(qw/InflateColumn::DateTime/); __PACKAGE__->table('event'); __PACKAGE__->add_columns( id => { data_type => 'integer', is_auto_increment => 1 }, starts_at => { data_type => 'datetime', timezone => "America/Chicago", locale => 'de_DE', datetime_undef_if_invalid => 1 }, created_on => { data_type => 'timestamp', timezone => "America/Chicago", floating_tz_ok => 1 }, ); __PACKAGE__->set_primary_key('id'); sub _datetime_parser { require DateTime::Format::MySQL; DateTime::Format::MySQL->new(); } 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/SelfRef.pm0000644000175000017500000000064613556035664027126 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::SelfRef; use base qw/DBICTest::BaseResult/; __PACKAGE__->table('self_ref'); __PACKAGE__->add_columns( 'id' => { data_type => 'integer', is_auto_increment => 1, }, 'name' => { data_type => 'varchar', size => 100, }, ); __PACKAGE__->set_primary_key('id'); __PACKAGE__->has_many( aliases => 'DBICTest::Schema::SelfRefAlias' => 'self_ref' ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/Artwork.pm0000644000175000017500000000324713556035664027231 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Artwork; use base qw/DBICTest::BaseResult/; use Carp qw/confess/; __PACKAGE__->table('cd_artwork'); __PACKAGE__->add_columns( 
'cd_id' => { data_type => 'integer', is_nullable => 0, }, ); __PACKAGE__->set_primary_key('cd_id'); __PACKAGE__->belongs_to('cd', 'DBICTest::Schema::CD', 'cd_id'); __PACKAGE__->has_many('images', 'DBICTest::Schema::Image', 'artwork_id'); __PACKAGE__->has_many('artwork_to_artist', 'DBICTest::Schema::Artwork_to_Artist', 'artwork_cd_id'); __PACKAGE__->many_to_many('artists', 'artwork_to_artist', 'artist'); # both to test manytomany with custom rel __PACKAGE__->many_to_many('artists_test_m2m', 'artwork_to_artist', 'artist_test_m2m'); __PACKAGE__->many_to_many('artists_test_m2m_noopt', 'artwork_to_artist', 'artist_test_m2m_noopt'); # other test to manytomany __PACKAGE__->has_many('artwork_to_artist_test_m2m', 'DBICTest::Schema::Artwork_to_Artist', sub { my $args = shift; # This is for test purposes only. A regular user does not # need to sanity check the passed-in arguments, this is what # the tests are for :) my @missing_args = grep { ! defined $args->{$_} } qw/self_alias foreign_alias self_resultsource foreign_relname/; confess "Required arguments not supplied to custom rel coderef: @missing_args\n" if @missing_args; return ( { "$args->{foreign_alias}.artwork_cd_id" => { -ident => "$args->{self_alias}.cd_id" }, }, $args->{self_rowobj} && { "$args->{foreign_alias}.artwork_cd_id" => $args->{self_rowobj}->cd_id, } ); } ); __PACKAGE__->many_to_many('artists_test_m2m2', 'artwork_to_artist_test_m2m', 'artist'); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/Encoded.pm0000644000175000017500000000150213556035664027131 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Encoded; use base qw/DBICTest::BaseResult/; use strict; use warnings; __PACKAGE__->table('encoded'); __PACKAGE__->add_columns( 'id' => { data_type => 'integer', is_auto_increment => 1 }, 'encoded' => { data_type => 'varchar', size => 100, is_nullable => 1, }, ); __PACKAGE__->set_primary_key('id'); __PACKAGE__->has_many (keyholders => 'DBICTest::Schema::Employee', 'encoded'); sub set_column { my ($self, $col, $value) = @_; if( $col eq 'encoded' ){ $value = reverse split '', $value; } $self->next::method($col, $value); } sub new { my($self, $attr, @rest) = @_; $attr->{encoded} = reverse split '', $attr->{encoded} if defined $attr->{encoded}; return $self->next::method($attr, @rest); } 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/TwoKeys.pm0000644000175000017500000000136113556035664027200 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::TwoKeys; use base qw/DBICTest::BaseResult/; __PACKAGE__->table('twokeys'); __PACKAGE__->add_columns( 'artist' => { data_type => 'integer' }, 'cd' => { data_type => 'integer' }, ); __PACKAGE__->set_primary_key(qw/artist cd/); __PACKAGE__->belongs_to( artist => 'DBICTest::Schema::Artist', {'foreign.artistid'=>'self.artist'}, ); __PACKAGE__->belongs_to( cd => 'DBICTest::Schema::CD', undef, { is_deferrable => 0, add_fk_index => 0 } ); __PACKAGE__->has_many( 'fourkeys_to_twokeys', 'DBICTest::Schema::FourKeys_to_TwoKeys', { 'foreign.t_artist' => 'self.artist', 'foreign.t_cd' => 'self.cd', }); __PACKAGE__->many_to_many( 'fourkeys', 'fourkeys_to_twokeys', 'fourkeys', ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/Employee.pm0000644000175000017500000000207713556035664027357 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Employee; use base qw/DBICTest::BaseResult/; __PACKAGE__->load_components(qw( Ordered )); __PACKAGE__->table('employee'); __PACKAGE__->add_columns( employee_id => { 
data_type => 'integer', is_auto_increment => 1 }, position => { data_type => 'integer', }, group_id => { data_type => 'integer', is_nullable => 1, }, group_id_2 => { data_type => 'integer', is_nullable => 1, }, group_id_3 => { data_type => 'integer', is_nullable => 1, }, name => { data_type => 'varchar', size => 100, is_nullable => 1, }, encoded => { data_type => 'integer', is_nullable => 1, }, ); __PACKAGE__->set_primary_key('employee_id'); __PACKAGE__->position_column('position'); # Do not add unique constraints here - different groups are used throughout # the ordered tests __PACKAGE__->belongs_to (secretkey => 'DBICTest::Schema::Encoded', 'encoded', { join_type => 'left' }); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/Bookmark.pm0000644000175000017500000000123413556035664027337 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Bookmark; use base qw/DBICTest::BaseResult/; use strict; use warnings; __PACKAGE__->table('bookmark'); __PACKAGE__->add_columns( 'id' => { data_type => 'integer', is_auto_increment => 1 }, 'link' => { data_type => 'integer', is_nullable => 1, }, ); __PACKAGE__->set_primary_key('id'); require DBICTest::Schema::Link; # so we can get a columnlist __PACKAGE__->belongs_to( link => 'DBICTest::Schema::Link', 'link', { on_delete => 'SET NULL', join_type => 'LEFT', proxy => { map { join('_', 'link', $_) => $_ } DBICTest::Schema::Link->columns }, }); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/BindType.pm0000644000175000017500000000104013556035664027303 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::BindType; use base qw/DBICTest::BaseResult/; __PACKAGE__->table('bindtype_test'); __PACKAGE__->add_columns( 'id' => { data_type => 'integer', is_auto_increment => 1, }, 'bytea' => { data_type => 'bytea', is_nullable => 1, }, 'blob' => { data_type => 'blob', is_nullable => 1, }, 'clob' => { data_type => 'clob', is_nullable => 1, }, 'a_memo' => { data_type => 'memo', is_nullable => 1, }, ); __PACKAGE__->set_primary_key('id'); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/TreeLike.pm0000644000175000017500000000160513556035664027300 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::TreeLike; use base qw/DBICTest::BaseResult/; __PACKAGE__->table('treelike'); __PACKAGE__->add_columns( 'id' => { data_type => 'integer', is_auto_increment => 1 }, 'parent' => { data_type => 'integer' , is_nullable=>1}, 'name' => { data_type => 'varchar', size => 100, }, ); __PACKAGE__->set_primary_key(qw/id/); __PACKAGE__->belongs_to('parent', 'TreeLike', { 'foreign.id' => 'self.parent' }); __PACKAGE__->has_many('children', 'TreeLike', { 'foreign.parent' => 'self.id' }); ## since this is a self referential table we need to do a post deploy hook and get ## some data in while constraints are off sub sqlt_deploy_hook { my ($self, $sqlt_table) = @_; ## We don't seem to need this anymore, but keeping it for the moment ## $sqlt_table->add_index(name => 'idx_name', fields => ['name']); } 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/FourKeys.pm0000644000175000017500000000146613556035664027350 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::FourKeys; use base qw/DBICTest::BaseResult/; __PACKAGE__->table('fourkeys'); __PACKAGE__->add_columns( 'foo' => { data_type => 'integer' }, 'bar' => { data_type => 'integer' }, 'hello' => { data_type => 'integer' }, 'goodbye' => { data_type => 'integer' }, 'sensors' => { data_type => 'character', 
size => 10 }, 'read_count' => { data_type => 'int', is_nullable => 1 }, ); __PACKAGE__->set_primary_key(qw/foo bar hello goodbye/); __PACKAGE__->has_many( 'fourkeys_to_twokeys', 'DBICTest::Schema::FourKeys_to_TwoKeys', { 'foreign.f_foo' => 'self.foo', 'foreign.f_bar' => 'self.bar', 'foreign.f_hello' => 'self.hello', 'foreign.f_goodbye' => 'self.goodbye', }); __PACKAGE__->many_to_many( 'twokeys', 'fourkeys_to_twokeys', 'twokeys', ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/Producer.pm0000644000175000017500000000107313556035664027356 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Producer; use base qw/DBICTest::BaseResult/; __PACKAGE__->table('producer'); __PACKAGE__->add_columns( 'producerid' => { data_type => 'integer', is_auto_increment => 1 }, 'name' => { data_type => 'varchar', size => 100, }, ); __PACKAGE__->set_primary_key('producerid'); __PACKAGE__->add_unique_constraint(prod_name => [ qw/name/ ]); __PACKAGE__->has_many( producer_to_cd => 'DBICTest::Schema::CD_to_Producer' => 'producer' ); __PACKAGE__->many_to_many('cds', 'producer_to_cd', 'cd'); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/EventTZPg.pm0000644000175000017500000000153113556035664027420 0ustar ahartmaiahartmaipackage DBICTest::Schema::EventTZPg; use strict; use warnings; use base qw/DBICTest::BaseResult/; __PACKAGE__->load_components(qw/InflateColumn::DateTime/); __PACKAGE__->table('event'); __PACKAGE__->add_columns( id => { data_type => 'integer', is_auto_increment => 1 }, starts_at => { data_type => 'datetime', timezone => "America/Chicago", locale => 'de_DE' }, created_on => { data_type => 'timestamp with time zone', timezone => "America/Chicago" }, ts_without_tz => { data_type => 'timestamp without time zone' }, ); __PACKAGE__->set_primary_key('id'); sub _datetime_parser { require DateTime::Format::Pg; DateTime::Format::Pg->new(); } # this is for a reentrancy test, the duplication from above is intentional __PACKAGE__->add_columns( ts_without_tz => { data_type => 'timestamp without time zone', inflate_datetime => 1 }, ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/CustomSql.pm0000644000175000017500000000060513556035664027525 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::CustomSql; use base qw/DBICTest::Schema::Artist/; __PACKAGE__->table('dummy'); __PACKAGE__->result_source_instance->name(\<schema->drop_table($_[1]) } 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/ArtistGUID.pm0000644000175000017500000000131713556035664027513 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::ArtistGUID; use base qw/DBICTest::BaseResult/; # test MSSQL uniqueidentifier type __PACKAGE__->table('artist_guid'); __PACKAGE__->add_columns( 'artistid' => { data_type => 'uniqueidentifier' # auto_nextval not necessary for PK }, 'name' => { data_type => 'varchar', size => 100, is_nullable => 1, }, rank => { data_type => 'integer', default_value => 13, }, charfield => { data_type => 'char', size => 10, is_nullable => 1, }, a_guid => { data_type => 'uniqueidentifier', auto_nextval => 1, # necessary here, because not a PK is_nullable => 1, } ); __PACKAGE__->set_primary_key('artistid'); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/LinerNotes.pm0000644000175000017500000000061713556035664027660 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::LinerNotes; use base qw/DBICTest::BaseResult/; __PACKAGE__->table('liner_notes'); 
__PACKAGE__->add_columns( 'liner_id' => { data_type => 'integer', }, 'notes' => { data_type => 'varchar', size => 100, }, ); __PACKAGE__->set_primary_key('liner_id'); __PACKAGE__->belongs_to( 'cd', 'DBICTest::Schema::CD', 'liner_id' ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/VaryingMAX.pm0000644000175000017500000000120713556035664027557 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::VaryingMAX; use base qw/DBICTest::BaseResult/; # Test VARCHAR(MAX) type for MSSQL (used in ADO tests) __PACKAGE__->table('varying_max_test'); __PACKAGE__->add_columns( 'id' => { data_type => 'integer', is_auto_increment => 1, }, 'varchar_max' => { data_type => 'varchar', size => 'max', is_nullable => 1, }, 'nvarchar_max' => { data_type => 'nvarchar', size => 'max', is_nullable => 1, }, 'varbinary_max' => { data_type => 'varbinary(max)', # alternately size => undef, is_nullable => 1, }, ); __PACKAGE__->set_primary_key('id'); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/Collection.pm0000644000175000017500000000167513556035664027676 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Collection; use base qw/DBICTest::BaseResult/; __PACKAGE__->table('collection'); __PACKAGE__->add_columns( 'collectionid' => { data_type => 'integer', is_auto_increment => 1, }, 'name' => { data_type => 'varchar', size => 100, }, ); __PACKAGE__->set_primary_key('collectionid'); __PACKAGE__->has_many( collection_object => "DBICTest::Schema::CollectionObject", { "foreign.collection" => "self.collectionid" } ); __PACKAGE__->many_to_many( objects => collection_object => "object" ); __PACKAGE__->many_to_many( pointy_objects => collection_object => "object", { where => { "object.type" => "pointy" } } ); __PACKAGE__->many_to_many( round_objects => collection_object => "object", { where => { "object.type" => "round" } } ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/Serialized.pm0000644000175000017500000000046113556035664027666 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Serialized; use base qw/DBICTest::BaseResult/; __PACKAGE__->table('serialized'); __PACKAGE__->add_columns( 'id' => { data_type => 'integer', is_auto_increment => 1 }, 'serialized' => { data_type => 'text' }, ); __PACKAGE__->set_primary_key('id'); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/TypedObject.pm0000644000175000017500000000124613556035664030011 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::TypedObject; use base qw/DBICTest::BaseResult/; __PACKAGE__->table('typed_object'); __PACKAGE__->add_columns( 'objectid' => { data_type => 'integer', is_auto_increment => 1, }, 'type' => { data_type => 'varchar', size => '100', }, 'value' => { data_type => 'varchar', size => 100, }, ); __PACKAGE__->set_primary_key('objectid'); __PACKAGE__->has_many( collection_object => "DBICTest::Schema::CollectionObject", { "foreign.object" => "self.objectid" } ); __PACKAGE__->many_to_many( collections => collection_object => "collection" ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/NoSuchClass.pm0000644000175000017500000000017113556035664027756 0ustar ahartmaiahartmaipackage DBICTest::Schema::NoSuchClass; ## This is purposefully not a real DBIC class ## Used in t/102load_classes.t 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/Year2000CDs.pm0000644000175000017500000000106413556035664027367 0ustar ahartmaiahartmaipackage # hide from PAUSE 
DBICTest::Schema::Year2000CDs; use base qw/DBICTest::Schema::CD/; __PACKAGE__->table_class('DBIx::Class::ResultSource::View'); __PACKAGE__->table('year2000cds'); # need to operate on the instance for things to work __PACKAGE__->result_source_instance->view_definition( sprintf ( 'SELECT %s FROM cd WHERE year = "2000"', join (', ', __PACKAGE__->columns), )); __PACKAGE__->belongs_to( artist => 'DBICTest::Schema::Artist' ); __PACKAGE__->has_many( tracks => 'DBICTest::Schema::Track', { "foreign.cd" => "self.cdid" }); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/Year1999CDs.pm0000644000175000017500000000173213556035664027423 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Year1999CDs; ## Used in 104view.t use base qw/DBICTest::BaseResult/; __PACKAGE__->table_class('DBIx::Class::ResultSource::View'); __PACKAGE__->table('year1999cds'); __PACKAGE__->result_source_instance->is_virtual(1); __PACKAGE__->result_source_instance->view_definition( "SELECT cdid, artist, title, single_track FROM cd WHERE year ='1999'" ); __PACKAGE__->add_columns( 'cdid' => { data_type => 'integer', is_auto_increment => 1, }, 'artist' => { data_type => 'integer', }, 'title' => { data_type => 'varchar', size => 100, }, 'single_track' => { data_type => 'integer', is_nullable => 1, is_foreign_key => 1, }, ); __PACKAGE__->set_primary_key('cdid'); __PACKAGE__->add_unique_constraint([ qw/artist title/ ]); __PACKAGE__->belongs_to( artist => 'DBICTest::Schema::Artist' ); __PACKAGE__->has_many( tracks => 'DBICTest::Schema::Track', { "foreign.cd" => "self.cdid" }); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/ForceForeign.pm0000644000175000017500000000155413556035664030147 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::ForceForeign; use base qw/DBICTest::BaseResult/; __PACKAGE__->table('forceforeign'); __PACKAGE__->add_columns( 'artist' => { data_type => 'integer' }, 'cd' => { data_type => 'integer' }, ); __PACKAGE__->set_primary_key(qw/artist/); # Normally this would not appear as a FK constraint # since it uses the PK __PACKAGE__->might_have('artist_1', 'DBICTest::Schema::Artist', { 'foreign.artistid' => 'self.artist' }, { is_foreign_key_constraint => 1 }, ); # Normally this would appear as a FK constraint __PACKAGE__->might_have('cd_1', 'DBICTest::Schema::CD', { 'foreign.cdid' => 'self.cd' }, { is_foreign_key_constraint => 0 }, ); # Normally this would appear as a FK constraint __PACKAGE__->belongs_to('cd_3', 'DBICTest::Schema::CD', { 'foreign.cdid' => 'self.cd' }, { is_foreign_key_constraint => 0 }, ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/SequenceTest.pm0000644000175000017500000000151713556035664030206 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::SequenceTest; use base qw/DBICTest::BaseResult/; __PACKAGE__->table('sequence_test'); __PACKAGE__->source_info({ "source_info_key_A" => "source_info_value_A", "source_info_key_B" => "source_info_value_B", "source_info_key_C" => "source_info_value_C", "source_info_key_D" => "source_info_value_D", }); __PACKAGE__->add_columns( 'pkid1' => { data_type => 'integer', auto_nextval => 1, sequence => \'"pkid1_seq"', }, 'pkid2' => { data_type => 'integer', auto_nextval => 1, sequence => \'pkid2_seq', }, 'nonpkid' => { data_type => 'integer', auto_nextval => 1, sequence => 'nonpkid_seq', }, 'name' => { data_type => 'varchar', size => 100, is_nullable => 1, }, ); __PACKAGE__->set_primary_key('pkid1', 'pkid2'); 1; 
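# Illustrative sketch (not part of the distribution): the view classes above are
# queried like any other source. For the virtual Year1999CDs view DBIx::Class
# inlines the SQL from view_definition() as a subquery; the non-virtual
# Year2000CDs instead expects a real database view to have been deployed.
# $schema (a connected, deployed instance of the test schema) and the sample
# search conditions below are assumptions made for this example.
my @cds_from_1999 = $schema->resultset('Year1999CDs')->search(
    { artist   => 1 },        # WHERE condition applied on top of the view SQL
    { order_by => 'title' },
)->all;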
DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/EventSmallDT.pm0000644000175000017500000000061613556035664030077 0ustar ahartmaiahartmaipackage DBICTest::Schema::EventSmallDT; use strict; use warnings; use base qw/DBICTest::BaseResult/; __PACKAGE__->load_components(qw/InflateColumn::DateTime/); __PACKAGE__->table('event_small_dt'); __PACKAGE__->add_columns( id => { data_type => 'integer', is_auto_increment => 1 }, small_dt => { data_type => 'smalldatetime', is_nullable => 1 }, ); __PACKAGE__->set_primary_key('id'); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/NoPrimaryKey.pm0000644000175000017500000000053613556035664030167 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::NoPrimaryKey; use base qw/DBICTest::BaseResult/; __PACKAGE__->table('noprimarykey'); __PACKAGE__->add_columns( 'foo' => { data_type => 'integer' }, 'bar' => { data_type => 'integer' }, 'baz' => { data_type => 'integer' }, ); __PACKAGE__->add_unique_constraint(foo_bar => [ qw/foo bar/ ]); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/LyricVersion.pm0000644000175000017500000000104713556035664030224 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::LyricVersion; use base qw/DBICTest::BaseResult/; __PACKAGE__->table('lyric_versions'); __PACKAGE__->add_columns( 'id' => { data_type => 'integer', is_auto_increment => 1, }, 'lyric_id' => { data_type => 'integer', is_foreign_key => 1, }, 'text' => { data_type => 'varchar', size => 100, }, ); __PACKAGE__->set_primary_key('id'); __PACKAGE__->add_unique_constraint ([qw/lyric_id text/]); __PACKAGE__->belongs_to('lyric', 'DBICTest::Schema::Lyrics', 'lyric_id'); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/SelfRefAlias.pm0000644000175000017500000000070513556035664030074 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::SelfRefAlias; use base qw/DBICTest::BaseResult/; __PACKAGE__->table('self_ref_alias'); __PACKAGE__->add_columns( 'self_ref' => { data_type => 'integer', }, 'alias' => { data_type => 'integer', }, ); __PACKAGE__->set_primary_key(qw/self_ref alias/); __PACKAGE__->belongs_to( self_ref => 'DBICTest::Schema::SelfRef' ); __PACKAGE__->belongs_to( alias => 'DBICTest::Schema::SelfRef' ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/ComputedColumn.pm0000644000175000017500000000120313556035664030524 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::ComputedColumn; # for sybase and mssql computed column tests use base qw/DBICTest::BaseResult/; __PACKAGE__->table('computed_column_test'); __PACKAGE__->add_columns( 'id' => { data_type => 'integer', is_auto_increment => 1, }, 'a_computed_column' => { data_type => undef, is_nullable => 0, default_value => \'getdate()', }, 'a_timestamp' => { data_type => 'timestamp', is_nullable => 0, }, 'charfield' => { data_type => 'varchar', size => 20, default_value => 'foo', is_nullable => 0, } ); __PACKAGE__->set_primary_key('id'); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/CD_to_Producer.pm0000644000175000017500000000115313556035664030425 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::CD_to_Producer; use base qw/DBICTest::BaseResult/; __PACKAGE__->table('cd_to_producer'); __PACKAGE__->add_columns( cd => { data_type => 'integer' }, producer => { data_type => 'integer' }, attribute => { data_type => 'integer', is_nullable => 1 }, ); __PACKAGE__->set_primary_key(qw/cd producer/); __PACKAGE__->belongs_to( 'cd', 
'DBICTest::Schema::CD', { 'foreign.cdid' => 'self.cd' } ); __PACKAGE__->belongs_to( 'producer', 'DBICTest::Schema::Producer', { 'foreign.producerid' => 'self.producer' }, { on_delete => undef, on_update => undef }, ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/ArtistSubclass.pm0000644000175000017500000000022213556035664030534 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::ArtistSubclass; use base 'DBICTest::Schema::Artist'; __PACKAGE__->table(__PACKAGE__->table); 1;DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/TwoKeyTreeLike.pm0000644000175000017500000000124313556035664030441 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::TwoKeyTreeLike; use base qw/DBICTest::BaseResult/; __PACKAGE__->table('twokeytreelike'); __PACKAGE__->add_columns( 'id1' => { data_type => 'integer' }, 'id2' => { data_type => 'integer' }, 'parent1' => { data_type => 'integer' }, 'parent2' => { data_type => 'integer' }, 'name' => { data_type => 'varchar', size => 100, }, ); __PACKAGE__->set_primary_key(qw/id1 id2/); __PACKAGE__->add_unique_constraint('tktlnameunique' => ['name']); __PACKAGE__->belongs_to('parent', 'DBICTest::Schema::TwoKeyTreeLike', { 'foreign.id1' => 'self.parent1', 'foreign.id2' => 'self.parent2'}); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/BooksInLibrary.pm0000644000175000017500000000133413556035664030464 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::BooksInLibrary; use base qw/DBICTest::BaseResult/; __PACKAGE__->table('books'); __PACKAGE__->add_columns( 'id' => { data_type => 'integer', is_auto_increment => 1, }, 'source' => { data_type => 'varchar', size => '100', }, 'owner' => { data_type => 'integer', }, 'title' => { data_type => 'varchar', size => '100', }, 'price' => { data_type => 'integer', is_nullable => 1, }, ); __PACKAGE__->set_primary_key('id'); __PACKAGE__->add_unique_constraint (['title']); __PACKAGE__->resultset_attributes({where => { source => "Library" } }); __PACKAGE__->belongs_to ( owner => 'DBICTest::Schema::Owners', 'owner' ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/CollectionObject.pm0000644000175000017500000000122613556035664031015 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::CollectionObject; use base qw/DBICTest::BaseResult/; __PACKAGE__->table('collection_object'); __PACKAGE__->add_columns( 'collection' => { data_type => 'integer', }, 'object' => { data_type => 'integer', }, ); __PACKAGE__->set_primary_key(qw/collection object/); __PACKAGE__->belongs_to( collection => "DBICTest::Schema::Collection", { "foreign.collectionid" => "self.collection" } ); __PACKAGE__->belongs_to( object => "DBICTest::Schema::TypedObject", { "foreign.objectid" => "self.object" } ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/ArtistSourceName.pm0000644000175000017500000000030313556035664031016 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::ArtistSourceName; use base 'DBICTest::Schema::Artist'; __PACKAGE__->table(__PACKAGE__->table); __PACKAGE__->source_name('SourceNameArtists'); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/EventTZDeprecated.pm0000644000175000017500000000120213556035664031105 0ustar ahartmaiahartmaipackage DBICTest::Schema::EventTZDeprecated; use strict; use warnings; use base qw/DBICTest::BaseResult/; __PACKAGE__->load_components(qw/InflateColumn::DateTime/); __PACKAGE__->table('event'); __PACKAGE__->add_columns( id 
=> { data_type => 'integer', is_auto_increment => 1 }, starts_at => { data_type => 'datetime', extra => { timezone => "America/Chicago", locale => 'de_DE' } }, created_on => { data_type => 'timestamp', extra => { timezone => "America/Chicago", floating_tz_ok => 1 } }, ); __PACKAGE__->set_primary_key('id'); sub _datetime_parser { require DateTime::Format::MySQL; DateTime::Format::MySQL->new(); } 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/Artwork_to_Artist.pm0000644000175000017500000000410213556035664031250 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::Artwork_to_Artist; use base qw/DBICTest::BaseResult/; use Carp qw/confess/; __PACKAGE__->table('artwork_to_artist'); __PACKAGE__->add_columns( 'artwork_cd_id' => { data_type => 'integer', is_foreign_key => 1, }, 'artist_id' => { data_type => 'integer', is_foreign_key => 1, }, ); __PACKAGE__->set_primary_key(qw/artwork_cd_id artist_id/); __PACKAGE__->belongs_to('artwork', 'DBICTest::Schema::Artwork', 'artwork_cd_id'); __PACKAGE__->belongs_to('artist', 'DBICTest::Schema::Artist', 'artist_id'); __PACKAGE__->belongs_to('artist_test_m2m', 'DBICTest::Schema::Artist', sub { my $args = shift; # This is for test purposes only. A regular user does not # need to sanity check the passed-in arguments, this is what # the tests are for :) my @missing_args = grep { ! defined $args->{$_} } qw/self_alias foreign_alias self_resultsource foreign_relname/; confess "Required arguments not supplied to custom rel coderef: @missing_args\n" if @missing_args; return ( { "$args->{foreign_alias}.artistid" => { -ident => "$args->{self_alias}.artist_id" }, "$args->{foreign_alias}.rank" => { '<' => 10 }, }, $args->{self_rowobj} && { "$args->{foreign_alias}.artistid" => $args->{self_rowobj}->artist_id, "$args->{foreign_alias}.rank" => { '<' => 10 }, } ); } ); __PACKAGE__->belongs_to('artist_test_m2m_noopt', 'DBICTest::Schema::Artist', sub { my $args = shift; # This is for test purposes only. A regular user does not # need to sanity check the passed-in arguments, this is what # the tests are for :) my @missing_args = grep { ! 
defined $args->{$_} } qw/self_alias foreign_alias self_resultsource foreign_relname/; confess "Required arguments not supplied to custom rel coderef: @missing_args\n" if @missing_args; return ( { "$args->{foreign_alias}.artistid" => { -ident => "$args->{self_alias}.artist_id" }, "$args->{foreign_alias}.rank" => { '<' => 10 }, } ); } ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/FourKeys_to_TwoKeys.pm0000644000175000017500000000174013556035664031532 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::FourKeys_to_TwoKeys; use base qw/DBICTest::BaseResult/; __PACKAGE__->table('fourkeys_to_twokeys'); __PACKAGE__->add_columns( 'f_foo' => { data_type => 'integer' }, 'f_bar' => { data_type => 'integer' }, 'f_hello' => { data_type => 'integer' }, 'f_goodbye' => { data_type => 'integer' }, 't_artist' => { data_type => 'integer' }, 't_cd' => { data_type => 'integer' }, 'autopilot' => { data_type => 'character' }, 'pilot_sequence' => { data_type => 'integer', is_nullable => 1 }, ); __PACKAGE__->set_primary_key( qw/f_foo f_bar f_hello f_goodbye t_artist t_cd/ ); __PACKAGE__->belongs_to('fourkeys', 'DBICTest::Schema::FourKeys', { 'foreign.foo' => 'self.f_foo', 'foreign.bar' => 'self.f_bar', 'foreign.hello' => 'self.f_hello', 'foreign.goodbye' => 'self.f_goodbye', }); __PACKAGE__->belongs_to('twokeys', 'DBICTest::Schema::TwoKeys', { 'foreign.artist' => 'self.t_artist', 'foreign.cd' => 'self.t_cd', }); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/TimestampPrimaryKey.pm0000644000175000017500000000047513556035664031560 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::TimestampPrimaryKey; use base qw/DBICTest::BaseResult/; __PACKAGE__->table('timestamp_primary_key_test'); __PACKAGE__->add_columns( 'id' => { data_type => 'timestamp', default_value => \'current_timestamp', }, ); __PACKAGE__->set_primary_key('id'); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/ArtistUndirectedMap.pm0000644000175000017500000000132613556035664031507 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::ArtistUndirectedMap; use base qw/DBICTest::BaseResult/; __PACKAGE__->table('artist_undirected_map'); __PACKAGE__->add_columns( 'id1' => { data_type => 'integer' }, 'id2' => { data_type => 'integer' }, ); __PACKAGE__->set_primary_key(qw/id1 id2/); __PACKAGE__->belongs_to( 'artist1', 'DBICTest::Schema::Artist', 'id1', { on_delete => 'RESTRICT', on_update => 'CASCADE'} ); __PACKAGE__->belongs_to( 'artist2', 'DBICTest::Schema::Artist', 'id2', { on_delete => undef, on_update => undef} ); __PACKAGE__->has_many( 'mapped_artists', 'DBICTest::Schema::Artist', [ {'foreign.artistid' => 'self.id1'}, {'foreign.artistid' => 'self.id2'} ], { cascade_delete => 0 }, ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Schema/PunctuatedColumnName.pm0000644000175000017500000000113013556035664031660 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Schema::PunctuatedColumnName; use base qw/DBICTest::BaseResult/; __PACKAGE__->table('punctuated_column_name'); __PACKAGE__->add_columns( 'id' => { data_type => 'integer', is_auto_increment => 1, }, q{foo ' bar} => { data_type => 'integer', is_nullable => 1, accessor => 'foo_bar', }, q{bar/baz} => { data_type => 'integer', is_nullable => 1, accessor => 'bar_baz', }, q{baz;quux} => { data_type => 'integer', is_nullable => 1, accessor => 'bar_quux', }, ); __PACKAGE__->set_primary_key('id'); 1; 
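# Illustrative sketch (not part of the distribution): the punctuated column names
# declared in DBICTest::Schema::PunctuatedColumnName above cannot be used as
# method names, which is why each column declares an explicit 'accessor'.
# Assuming $schema is a connected, deployed instance of the test schema, the
# columns can be reached either through those accessors or through the generic
# column methods using the raw names:
my $punctuated = $schema->resultset('PunctuatedColumnName')->create({
    q{foo ' bar} => 1,                            # raw column names work here
});
$punctuated->foo_bar(2);                          # write via the declared accessor
my $value = $punctuated->get_column(q{bar/baz});  # read via the raw column name
$punctuated->update;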
DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/BaseResult.pm0000644000175000017500000000147013556035664026445 0ustar ahartmaiahartmaipackage #hide from pause DBICTest::BaseResult; use strict; use warnings; # must load before any DBIx::Class* namespaces use DBICTest::RunMode; use base 'DBIx::Class::Core'; #use base qw/DBIx::Class::Relationship::Cascade::Rekey DBIx::Class::Core/; __PACKAGE__->table ('bogus'); __PACKAGE__->resultset_class ('DBICTest::BaseResultSet'); #sub add_relationship { # my $self = shift; # my $opts = $_[3] || {}; # if (grep { $_ eq $_[0] } qw/ # cds_90s cds_80s cds_84 artist_undirected_maps mapped_artists last_track # /) { # # nothing - join-dependent or non-cascadeable relationship # } # elsif ($opts->{is_foreign_key_constraint}) { # $opts->{on_update} ||= 'cascade'; # } # else { # $opts->{cascade_rekey} = 1 # unless ref $_[2] eq 'CODE'; # } # $self->next::method(@_[0..2], $opts); #} 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/BaseSchema.pm0000644000175000017500000000027013556035664026364 0ustar ahartmaiahartmaipackage #hide from pause DBICTest::BaseSchema; use strict; use warnings; # must load before any DBIx::Class* namespaces use DBICTest::RunMode; use base 'DBIx::Class::Schema'; 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/FakeComponent.pm0000644000175000017500000000020313556035664027116 0ustar ahartmaiahartmai# belongs to t/run/90ensure_class_loaded.tl package # hide from PAUSE DBICTest::FakeComponent; use warnings; use strict; 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/BaseResultSet.pm0000644000175000017500000000046613556035664027125 0ustar ahartmaiahartmaipackage #hide from pause DBICTest::BaseResultSet; use strict; use warnings; # must load before any DBIx::Class* namespaces use DBICTest::RunMode; use base 'DBIx::Class::ResultSet'; sub all_hri { return [ shift->search ({}, { result_class => 'DBIx::Class::ResultClass::HashRefInflator' })->all ]; } 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/ErrorComponent.pm0000644000175000017500000000024313556035664027345 0ustar ahartmaiahartmai# belongs to t/run/90ensure_class_loaded.tl package # hide from PAUSE DBICTest::ErrorComponent; use warnings; use strict; # this is missing on purpose # 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/DeployComponent.pm0000644000175000017500000000040713556035664027512 0ustar ahartmaiahartmai# belongs to t/86sqlt.t package # hide from PAUSE DBICTest::DeployComponent; use warnings; use strict; our $hook_cb; sub sqlt_deploy_hook { my $class = shift; $hook_cb->($class, @_) if $hook_cb; $class->next::method(@_) if $class->next::can; } 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/ForeignComponent.pm0000644000175000017500000000034313556035664027646 0ustar ahartmaiahartmai# belongs to t/05components.t package # hide from PAUSE DBICTest::ForeignComponent; use warnings; use strict; use base qw/ DBIx::Class /; __PACKAGE__->load_components( qw/ +DBICTest::ForeignComponent::TestComp / ); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/ResultSetManager.pm0000644000175000017500000000020213556035664027611 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::ResultSetManager; use base 'DBICTest::BaseSchema'; __PACKAGE__->load_classes("Foo"); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/OptionalComponent.pm0000644000175000017500000000020713556035664030041 0ustar ahartmaiahartmai# belongs to t/run/90ensure_class_loaded.tl package 
# hide from PAUSE DBICTest::OptionalComponent; use warnings; use strict; 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Taint/0000775000175000017500000000000013556035664025115 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Taint/Classes/0000775000175000017500000000000013556035664026512 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Taint/Classes/Auto.pm0000644000175000017500000000017513556035664027761 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Taint::Classes::Auto; use base 'DBIx::Class::Core'; __PACKAGE__->table('test'); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Taint/Classes/Manual.pm0000644000175000017500000000017713556035664030270 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Taint::Classes::Manual; use base 'DBIx::Class::Core'; __PACKAGE__->table('test'); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Taint/Namespaces/0000775000175000017500000000000013556035664027174 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Taint/Namespaces/Result/0000775000175000017500000000000013556035664030452 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Taint/Namespaces/Result/Test.pm0000644000175000017500000000021013556035664031716 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::Taint::Namespaces::Result::Test; use base 'DBIx::Class::Core'; __PACKAGE__->table('test'); 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/ResultSetManager/0000775000175000017500000000000013556035664027263 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/ResultSetManager/Foo.pm0000644000175000017500000000032313556035664030340 0ustar ahartmaiahartmaipackage # hide from PAUSE DBICTest::ResultSetManager::Foo; use base 'DBIx::Class::Core'; __PACKAGE__->load_components(qw/ ResultSetManager /); __PACKAGE__->table('foo'); sub bar : ResultSet { 'good' } 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Util/0000775000175000017500000000000013556035664024753 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/Util/OverrideRequire.pm0000644000175000017500000000776713556035664030444 0ustar ahartmaiahartmaipackage DBICTest::Util::OverrideRequire; # no use/require of any kind - work bare BEGIN { # Neat STDERR require call tracer # # 0 - no trace # 1 - just requires and return values # 2 - neat stacktrace (assumes that the supplied $override_cref does *not* (ab)use goto) # 3 - full stacktrace *TRACE = sub () { 0 }; } # Takes a single coderef and replaces CORE::GLOBAL::require with it. # # On subsequent require() calls, the coderef will be invoked with # two arguments - ($next_require, $module_name_copy) # # $next_require is a coderef closing over the module name. It needs # to be invoked at some point without arguments for the actual # require to take place (this way your coderef in essence becomes an # around modifier) # # $module_name_copy is a string-copy of what $next_require is closing # over. The reason for the copy is that you may trigger a side effect # on magical values, and subsequently abort the require (e.g. # require v.5.8.8 magic) # # All of this almost verbatim copied from Lexical::SealRequireHints # Zefram++ sub override_global_require (&) { my $override_cref = shift; our $next_require = defined(&CORE::GLOBAL::require) ? 
\&CORE::GLOBAL::require : sub { my ($arg) = @_; # The shenanigans with $CORE::GLOBAL::{require} # are required because if there's a # &CORE::GLOBAL::require when the eval is # executed then the CORE::require in there is # interpreted as plain require on some Perl # versions, leading to recursion. my $grequire = delete $CORE::GLOBAL::{require}; my $res = eval sprintf ' local $SIG{__DIE__}; $CORE::GLOBAL::{require} = $grequire; package %s; CORE::require($arg); ', scalar caller(0); # the caller already had its package replaced my $err = $@ if $@ ne ''; if( TRACE ) { if (TRACE == 1) { printf STDERR "Require of '%s' (returned: '%s')\n", (my $m_copy = $arg), (my $r_copy = $res), ; } else { my ($fr_num, @fr, @tr, $excise); while (@fr = caller($fr_num++)) { # Package::Stash::XS is a cock and gets mightily confused if one # uses a regex in the require hook. Even though it happens only # on < 5.8.7 it's still rather embarassing (also wtf does P::S::XS # even need to regex its own module name?!). So we do not use re :) if (TRACE == 3 or (index($fr[1], '(eval ') != 0 and index($fr[1], __FILE__) != 0) ) { push @tr, [@fr] } # the caller before this would be the override site - kill it away # if the cref writer uses goto - well tough, tracer won't work if ($fr[3] eq 'DBICTest::Util::OverrideRequire::__ANON__') { $excise ||= $tr[-2] if TRACE == 2; } } my @stack = map { "$_->[1], line $_->[2]" } grep { ! $excise or $_->[1] ne $excise->[1] or $_->[2] ne $excise->[2] } @tr ; printf STDERR "Require of '%s' (returned: '%s')\n%s\n\n", (my $m_copy = $arg), (my $r_copy = $res||''), join "\n", (map { " $_" } @stack) ; } } die $err if defined $err; return $res; } ; # Need to suppress the redefinition warning, without # invoking warnings.pm. BEGIN { ${^WARNING_BITS} = ""; } *CORE::GLOBAL::require = sub { die "wrong number of arguments to require\n" unless @_ == 1; # the copy is to prevent accidental overload firing (e.g. require v5.8.8) my ($arg_copy) = our ($arg) = @_; return $override_cref->(sub { die "The require delegate takes no arguments\n" if @_; my $res = eval sprintf ' local $SIG{__DIE__}; package %s; $next_require->($arg); ', scalar caller(2); # 2 for the indirection of the $override_cref around die $@ if $@ ne ''; return $res; }, $arg_copy); } } 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/SyntaxErrorComponent3.pm0000644000175000017500000000010613556035664030635 0ustar ahartmaiahartmaipackage DBICErrorTest::SyntaxError; use strict; I'm a syntax error! 
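# Illustrative usage sketch (not part of the distribution): a caller-side view of
# the interface documented in DBICTest::Util::OverrideRequire above. The hook
# receives a delegate coderef plus a copy of the module name; invoking the
# delegate with no arguments performs the real require, so the hook behaves like
# an "around" modifier. The tracing behaviour and the assumption that t_dbic/lib
# is already on @INC are illustrative only.
use lib 't_dbic/lib';
use DBICTest::Util::OverrideRequire;

DBICTest::Util::OverrideRequire::override_global_require(sub {
    my ($next_require, $module_name) = @_;
    warn "about to require $module_name\n";
    my $result = $next_require->();    # no arguments - runs the actual require
    warn "finished requiring $module_name\n";
    return $result;                    # becomes the return value of require()
});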
DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/SyntaxErrorComponent2.pm0000644000175000017500000000024713556035664030642 0ustar ahartmaiahartmai# belongs to t/run/90ensure_class_loaded.tl package # hide from PAUSE DBICTest::SyntaxErrorComponent2; use warnings; use strict; my $str ''; # syntax error 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/SyntaxErrorComponent1.pm0000644000175000017500000000024713556035664030641 0ustar ahartmaiahartmai# belongs to t/run/90ensure_class_loaded.tl package # hide from PAUSE DBICTest::SyntaxErrorComponent1; use warnings; use strict; my $str ''; # syntax error 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/ForeignComponent/0000775000175000017500000000000013556035664027312 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/t_dbic/lib/DBICTest/ForeignComponent/TestComp.pm0000644000175000017500000000024113556035664031401 0ustar ahartmaiahartmai# belongs to t/05components.t package # hide from PAUSE DBICTest::ForeignComponent::TestComp; use warnings; use strict; sub foreign_test_method { 1 } 1; DBIx-Class-ResultSet-RecursiveUpdate-0.40/lib/0000775000175000017500000000000013556035664021251 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/lib/DBIx/0000775000175000017500000000000013556035664022037 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/lib/DBIx/Class/0000775000175000017500000000000013556035664023104 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/lib/DBIx/Class/ResultSet/0000775000175000017500000000000013556035664025036 5ustar ahartmaiahartmaiDBIx-Class-ResultSet-RecursiveUpdate-0.40/lib/DBIx/Class/ResultSet/RecursiveUpdate.pm0000644000175000017500000012123713556035664030512 0ustar ahartmaiahartmaiuse strict; use warnings; package DBIx::Class::ResultSet::RecursiveUpdate; $DBIx::Class::ResultSet::RecursiveUpdate::VERSION = '0.40'; # ABSTRACT: like update_or_create - but recursive use base qw(DBIx::Class::ResultSet); sub recursive_update { my ( $self, $updates, $attrs ) = @_; my $fixed_fields; my $unknown_params_ok; my $m2m_force_set_rel; # 0.21+ api if ( defined $attrs && ref $attrs eq 'HASH' ) { $fixed_fields = $attrs->{fixed_fields}; $unknown_params_ok = $attrs->{unknown_params_ok}; $m2m_force_set_rel = $attrs->{m2m_force_set_rel}; } # pre 0.21 api elsif ( defined $attrs && ref $attrs eq 'ARRAY' ) { $fixed_fields = $attrs; } return DBIx::Class::ResultSet::RecursiveUpdate::Functions::recursive_update( resultset => $self, updates => $updates, fixed_fields => $fixed_fields, unknown_params_ok => $unknown_params_ok, m2m_force_set_rel => $m2m_force_set_rel, ); } package DBIx::Class::ResultSet::RecursiveUpdate::Functions; $DBIx::Class::ResultSet::RecursiveUpdate::Functions::VERSION = '0.40'; use Carp::Clan qw/^DBIx::Class|^HTML::FormHandler|^Try::Tiny/; use Scalar::Util qw( blessed ); use List::MoreUtils qw/ any all none /; use Try::Tiny; use Data::Dumper::Concise; use constant DEBUG => 0; sub recursive_update { my %params = @_; my ( $self, $updates, $fixed_fields, $object, $resolved, $if_not_submitted, $unknown_params_ok, $m2m_force_set_rel ) = @params{ qw/resultset updates fixed_fields object resolved if_not_submitted unknown_params_ok m2m_force_set_rel/ }; $resolved ||= {}; $ENV{DBIC_NULLABLE_KEY_NOWARN} = 1; my $source = $self->result_source; croak "first parameter needs to be defined" unless defined $updates; croak "first parameter needs to be a hashref" unless ref($updates) eq 'HASH'; croak 'fixed fields needs to be an arrayref' if defined 
$fixed_fields && ref $fixed_fields ne 'ARRAY'; DEBUG and warn "recursive_update: " . $source->name . "\n"; DEBUG and warn "object passed, skipping find" . (defined $object->id ? " (id " . $object->id . ")\n" : "\n") if defined $object; # always warn about additional parameters if storage debugging is enabled $unknown_params_ok = 0 if $source->storage->debug; if ( blessed($updates) && $updates->isa('DBIx::Class::Row') ) { return $updates; } my @pks = $source->primary_columns; my %pk_kvs; for my $colname (@pks) { if (exists $updates->{$colname} && defined $updates->{$colname}) { $pk_kvs{$colname} = $updates->{$colname}; next; } $pk_kvs{$colname} = $resolved->{$colname} if exists $resolved->{$colname} && defined $resolved->{$colname}; } # support the special case where a method on the related row # populates one or more primary key columns and we don't have # all primary key values already # see DBSchema::Result::DVD relationship keysbymethod DEBUG and warn "pk columns so far: " . join (', ', sort keys %pk_kvs) . "\n"; my @non_pk_columns = grep { my $colname = $_; none { $colname eq $_ } keys %pk_kvs } sort keys %$updates; DEBUG and warn "non-pk columns: " . join (', ', @non_pk_columns) . "\n"; if ( scalar keys %pk_kvs != scalar @pks && @non_pk_columns) { DEBUG and warn "not all primary keys available, trying " . "object creation\n"; # new_result throws exception if non column values are passed # because we want to also support e.g. a BUILDARGS method that # populates primary key columns from an additional value # filter out all relationships my @non_rel_columns = grep { !is_m2m( $self, $_ ) && !$source->has_relationship($_) } sort keys %$updates; my %non_rel_updates = map { $_ => $updates->{$_} } @non_rel_columns; # transform columns specified by their accessor name my %columns_by_accessor = _get_columns_by_accessor($self); for my $accessor_name (sort keys %columns_by_accessor) { my $colname = $columns_by_accessor{$accessor_name}->{name}; if ($accessor_name ne $colname && exists $non_rel_updates{$accessor_name}) { DEBUG and warn "renaming column accessor " . "'$accessor_name' to column name '$colname'\n"; $non_rel_updates{$colname} = delete $non_rel_updates{$accessor_name}; } } DEBUG and warn "using all non-rel updates for object " . "construction: " . Dumper(\%non_rel_updates); # the object creation might fail because of non-column and # non-constructor handled parameters which shouldn't break RU try { my $row = $self->new_result(\%non_rel_updates); for my $colname (@pks) { next if exists $pk_kvs{$colname}; if ($row->can($colname) && defined $row->$colname) { DEBUG and warn "missing pk column $colname exists " . "and defined on object\n"; $pk_kvs{$colname} = $row->$colname; } else { DEBUG and warn "missing pk column $colname doesn't " . 
"exist or isn't defined on object, aborting\n"; last; } } } catch { DEBUG and warn "object construction failed, ignoring: $_\n"; }; } # check if row can be found in resultset cache if ( !defined $object && scalar keys %pk_kvs == scalar @pks ) { my $cached_rows = $self->get_cache; if (defined $cached_rows) { DEBUG and warn "find in cache\n"; $object = _get_matching_row(\%pk_kvs, $cached_rows) } } $updates = { %$updates, %$resolved }; my %fixed_fields = map { $_ => 1 } @$fixed_fields; # add the resolved columns to the updates hashref my %all_pks = ( %pk_kvs, %fixed_fields ); if ( !defined $object && scalar keys %all_pks == scalar @pks) { DEBUG and warn "find by pk\n"; $object = $self->find( \%all_pks, { key => 'primary' } ); } unless (defined $object) { DEBUG and warn "create new row\n"; $object = $self->new_result( {} ); } # direct column accessors my %columns; # relations that that should be done before the row is inserted into the # database like belongs_to my %pre_updates; # relations that that should be done after the row is inserted into the # database like has_many, might_have and has_one my %post_updates; my %other_methods; my %m2m_accessors; my %columns_by_accessor = _get_columns_by_accessor($self); # this section determines to what each key/value pair maps to, # column or relationship for my $name ( sort keys %$updates ) { DEBUG and warn "updating $name to " . ($updates->{$name} // '[undef]') . "\n"; # columns if ( exists $columns_by_accessor{$name} && !( $source->has_relationship($name) && ref( $updates->{$name} ) ) ) { $columns{$name} = $updates->{$name}; next; } # relationships if ( $source->has_relationship($name) ) { if ( _master_relation_cond( $self, $name ) ) { $pre_updates{$name} = $updates->{$name}; next; } else { $post_updates{$name} = $updates->{$name}; next; } } # many-to-many helper accessors if ( is_m2m( $self, $name ) ) { DEBUG and warn "is m2m\n"; # Transform m2m data into recursive has_many data # if IntrospectableM2M is in use. # # This removes the overhead related to deleting and # re-adding all relationships. if ( !$m2m_force_set_rel && $source->result_class->can('_m2m_metadata') ) { my $meta = $source->result_class->_m2m_metadata->{$name}; my $bridge_rel = $meta->{relation}; my $foreign_rel = $meta->{foreign_relation}; $post_updates{$bridge_rel} = [ map { { $foreign_rel => $_ } } @{ $updates->{$name} } ]; } # Fall back to set_$rel if IntrospectableM2M # is not available. (removing and re-adding all relationships) else { $m2m_accessors{$name} = $updates->{$name}; } next; } # accessors if ( $object->can($name) && not $source->has_relationship($name) ) { $other_methods{$name} = $updates->{$name}; next; } # unknown # don't throw a warning instead of an exception to give users # time to adapt to the new API carp( "No such column, relationship, many-to-many helper accessor or " . "generic accessor '$name' on '" . $source->name . 
"'" ) unless $unknown_params_ok; } # first update columns and other accessors # so that later related records can be found for my $name ( sort keys %columns ) { $object->$name( $columns{$name} ); } for my $name ( sort keys %other_methods ) { $object->$name( $other_methods{$name} ); } for my $name ( sort keys %pre_updates ) { _update_relation( $self, $name, $pre_updates{$name}, $object, $if_not_submitted, 0 ); } # $self->_delete_empty_auto_increment($object); # don't allow insert to recurse to related objects # do the recursion ourselves # $object->{_rel_in_storage} = 1; # Update if %other_methods because of possible custom update method my $in_storage = $object->in_storage; # preserve related resultsets as DBIx::Class::Row->update clears them # yes, this directly accesses a row attribute, but no API exists and in # the hope to get the recursive_update feature into core DBIx::Class this # is the easiest solution my $related_resultsets = $object->{related_resultsets}; $object->update_or_insert if ( $object->is_changed || keys %other_methods ); # restore related resultsets $object->{related_resultsets} = $related_resultsets; # this is needed to populate all columns in the row object because # otherwise _resolve_condition in _update_relation fails if a foreign key # column isn't loaded if (not $in_storage) { DEBUG and warn "discard_changes for created row\n"; $object->discard_changes; } # updating many_to_many for my $name ( sort keys %m2m_accessors ) { DEBUG and warn "updating m2m $name\n"; my $value = $m2m_accessors{$name}; # TODO: only first pk col is used my ($pk) = _get_pk_for_related( $self, $name ); my @rows; my $rel_source = $object->$name->result_source; my @updates; if ( defined $value && ref $value eq 'ARRAY' ) { @updates = @{$value}; } elsif ( defined $value && !ref $value ) { @updates = ($value); } elsif ( defined $value ) { carp "value of many-to-many rel '$name' must be an arrayref or scalar: $value"; } for my $elem (@updates) { if ( blessed($elem) && $elem->isa('DBIx::Class::Row') ) { push @rows, $elem; } elsif ( ref $elem eq 'HASH' ) { push @rows, recursive_update( resultset => $rel_source->resultset, updates => $elem ); } else { push @rows, $rel_source->resultset->find( { $pk => $elem } ); } } my $set_meth = 'set_' . $name; $object->$set_meth( \@rows ); } for my $name ( sort keys %post_updates ) { _update_relation( $self, $name, $post_updates{$name}, $object, $if_not_submitted, $in_storage ); } delete $ENV{DBIC_NULLABLE_KEY_NOWARN}; return $object; } # returns DBIx::Class::ResultSource::column_info as a hash indexed by column accessor || name sub _get_columns_by_accessor { my $self = shift; my $source = $self->result_source; my %columns; for my $name ( $source->columns ) { my $info = $source->column_info($name); $info->{name} = $name; $columns{ $info->{accessor} || $name } = $info; } return %columns; } sub _get_matching_row { my ($kvs, $rows) = @_; return unless defined $rows; croak 'key/value need to be a hashref' unless ref $kvs eq 'HASH'; croak 'key/value needs to have at least one pair' if keys %$kvs == 0; croak 'rows need to be an arrayref' unless ref $rows eq 'ARRAY'; unless ($rows) { DEBUG and warn "skipping because no rows passed\n"; return; } my $matching_row; my @matching_rows; for my $row (@$rows) { push @matching_rows, $row if all { $kvs->{$_} eq $row->get_column($_) } grep { !ref $kvs->{$_} } sort keys %$kvs; } DEBUG and warn "multiple matching rows: " . scalar @matching_rows . 
"\n" if @matching_rows > 1; $matching_row = $matching_rows[0] if scalar @matching_rows == 1; DEBUG and warn "matching row found for: " . Dumper($kvs) . " in " . Dumper([map { { $_->get_columns } } @$rows]) . "\n" if defined $matching_row; DEBUG and warn "matching row not found for: " . Dumper($kvs) . " in " . Dumper([map { { $_->get_columns } } @$rows]) . "\n" unless defined $matching_row; return $matching_row; } # Arguments: $rs, $name, $updates, $row, $if_not_submitted, $row_existed sub _update_relation { my ( $self, $name, $updates, $object, $if_not_submitted, $row_existed ) = @_; # this should never happen because we're checking the paramters passed to # recursive_update, but just to be sure... $object->throw_exception("No such relationship '$name'") unless $object->has_relationship($name); DEBUG and warn "_update_relation: $name\n"; my $info = $object->result_source->relationship_info($name); my $attrs = $info->{attrs}; # get a related resultset without a condition my $related_resultset = $self->related_resultset($name)->result_source->resultset; my $resolved; if ( $self->result_source->can('_resolve_condition') ) { $resolved = $self->result_source->_resolve_condition( $info->{cond}, $name, $object, $name ); } else { $self->throw_exception("result_source must support _resolve_condition"); } $resolved = {} if defined $DBIx::Class::ResultSource::UNRESOLVABLE_CONDITION && $DBIx::Class::ResultSource::UNRESOLVABLE_CONDITION == $resolved; # This is a hack. I'm not sure that this will handle most # custom code conditions yet. This needs tests. my @rel_cols; if ( ref $info->{cond} eq 'CODE' ) { my $new_resolved; # remove 'me.' from keys in returned hashref while ( my ( $key, $value ) = each %$resolved ) { $key =~ s/^me\.//; $new_resolved->{$key} = $value; push @rel_cols, $key; } $resolved = $new_resolved; } else { @rel_cols = sort keys %{ $info->{cond} }; map { s/^foreign\.// } @rel_cols; } # find out if all related columns are nullable my $all_fks_nullable = 1; for my $rel_col (@rel_cols) { $all_fks_nullable = 0 unless $related_resultset->result_source->column_info($rel_col)->{is_nullable}; } $if_not_submitted = $all_fks_nullable ? 'set_to_null' : 'delete' unless defined $if_not_submitted; # the only valid datatype for a has_many rels is an arrayref if ( $attrs->{accessor} eq 'multi' ) { DEBUG and warn "has_many: $name\n"; # handle undef like empty arrayref $updates = [] unless defined $updates; $self->throw_exception("data for has_many relationship '$name' must be an arrayref") unless ref $updates eq 'ARRAY'; my @updated_objs; my @related_rows; # newly created rows can't have related rows if ($row_existed) { @related_rows = $object->$name; DEBUG and warn "got related rows: " . scalar @related_rows . 
"\n"; } my $related_result_source = $related_resultset->result_source; my @pks = $related_result_source->primary_columns; for my $sub_updates ( @{$updates} ) { DEBUG and warn "updating related row\n"; my %pk_kvs; # detect the special case where the primary key of a currently not # related row is passed in the updates hash for my $colname (@pks) { if (exists $sub_updates->{$colname} && defined $sub_updates->{$colname}) { $pk_kvs{$colname} = $sub_updates->{$colname}; next; } $pk_kvs{$colname} = $resolved->{$colname} if exists $resolved->{$colname} && defined $resolved->{$colname}; } my $related_object; # support the special case where a method on the related row # populates one or more primary key columns and we don't have # all primary key values already # see DBSchema::Result::DVD relationship keysbymethod DEBUG and warn "pk columns so far: " . join (', ', sort keys %pk_kvs) . "\n"; my @non_pk_columns = grep { my $colname = $_; none { $colname eq $_ } keys %pk_kvs } sort keys %$sub_updates; DEBUG and warn "non-pk columns: " . join (', ', @non_pk_columns) . "\n"; if ( scalar keys %pk_kvs != scalar @pks && @non_pk_columns) { DEBUG and warn "not all primary keys available, trying " . "object creation\n"; # new_result throws exception if non column values are passed # because we want to also support e.g. a BUILDARGS method that # populates primary key columns from an additional value # filter out all relationships my @non_rel_columns = grep { !is_m2m( $related_resultset, $_ ) && !$related_result_source->has_relationship($_) } sort keys %$sub_updates; my %non_rel_updates = map { $_ => $sub_updates->{$_} } @non_rel_columns; # transform columns specified by their accessor name my %columns_by_accessor = _get_columns_by_accessor($related_resultset); for my $accessor_name (sort keys %columns_by_accessor) { my $colname = $columns_by_accessor{$accessor_name}->{name}; if ($accessor_name ne $colname && exists $non_rel_updates{$accessor_name}) { DEBUG and warn "renaming column accessor " . "'$accessor_name' to column name '$colname'\n"; $non_rel_updates{$colname} = delete $non_rel_updates{$accessor_name}; } } DEBUG and warn "using all non-rel updates for object " . "construction: " . Dumper(\%non_rel_updates); # the object creation might fail because of non-column and # non-constructor handled parameters which shouldn't break RU try { my $related_row = $related_resultset ->new_result(\%non_rel_updates); for my $colname (@pks) { next if exists $pk_kvs{$colname}; if ($related_row->can($colname) && defined $related_row->$colname) { DEBUG and warn "missing pk column $colname exists " . "and defined on object\n"; $pk_kvs{$colname} = $related_row->$colname; } else { DEBUG and warn "missing pk column $colname doesn't " . 
"exist or isn't defined on object, aborting\n"; last; } } } catch { DEBUG and warn "object construction failed, ignoring: $_\n"; }; } if ( scalar keys %pk_kvs == scalar @pks ) { DEBUG and warn "all primary keys available\n"; # the lookup can fail if the primary key of a currently not # related row is passed in the updates hash $related_object = _get_matching_row(\%pk_kvs, \@related_rows); } # pass an empty object if no related row found and it's not the # special case where the primary key of a currently not related # row is passed in the updates hash to prevent the find by pk in # recursive_update to happen else { DEBUG and warn "passing empty row to prevent find by pk\n"; $related_object = $related_resultset->new_result({}); } my $sub_object = recursive_update( resultset => $related_resultset, updates => $sub_updates, resolved => $resolved, # pass prefetched object if found object => $related_object, ); push @updated_objs, $sub_object; } # determine if a removal query is required my @remove_rows = grep { my $existing_row = $_; none { $existing_row->ID eq $_->ID } @updated_objs } @related_rows; DEBUG and warn "rows for removal: " . join(', ', map { $_->ID } @remove_rows) . "\n"; if (scalar @remove_rows) { my $rs_rel_delist = $object->$name; # foreign table has a single pk column if (scalar @pks == 1) { DEBUG and warn "delete in not_in\n"; $rs_rel_delist = $rs_rel_delist->search_rs( { $self->current_source_alias . "." . $pks[0] => { -not_in => [ map ( $_->id, @updated_objs ) ] } } ); } # foreign table has multiple pk columns else { my @cond; for my $obj (@updated_objs) { my %cond_for_obj; for my $col (@pks) { $cond_for_obj{ $self->current_source_alias . ".$col" } = $obj->get_column($col); } push @cond, \%cond_for_obj; } # only limit resultset if there are related rows left if (scalar @cond) { $rs_rel_delist = $rs_rel_delist->search_rs({ -not => [ @cond ] }); } } if ($if_not_submitted eq 'delete') { $rs_rel_delist->delete; } elsif ($if_not_submitted eq 'set_to_null') { my %update = map {$_ => undef} @rel_cols; $rs_rel_delist->update(\%update); } } } elsif ( $attrs->{accessor} eq 'single' || $attrs->{accessor} eq 'filter' ) { DEBUG and warn "has_one, might_have, belongs_to (" . $attrs->{accessor} . "): $name\n"; my $sub_object; if ( ref $updates ) { my $existing_row = 0; my @pks = $related_resultset->result_source->primary_columns; if ( all { exists $updates->{$_} && defined $updates->{$_} } @pks ) { $existing_row = 1; } DEBUG and warn $existing_row ? "existing row\n" : "new row\n"; if ( blessed($updates) && $updates->isa('DBIx::Class::Row') ) { $sub_object = $updates; } elsif ( $attrs->{accessor} eq 'single' && defined $object->$name ) { $sub_object = recursive_update( resultset => $related_resultset, updates => $updates, $existing_row ? () : (object => $object->$name), ); } else { $sub_object = recursive_update( resultset => $related_resultset, updates => $updates, $existing_row ? 
() : (resolved => $resolved), ); } } else { $sub_object = $related_resultset->find($updates) unless ( !$updates && ( exists $attrs->{join_type} && $attrs->{join_type} eq 'LEFT' ) ); } my $join_type = $attrs->{join_type} || ''; # unmarked 'LEFT' join for belongs_to my $might_belong_to = ( $attrs->{accessor} eq 'single' || $attrs->{accessor} eq 'filter' ) && $attrs->{is_foreign_key_constraint}; # adding check for custom condition that's a coderef # this 'set_from_related' should probably not be called in lots of other # situations too, but until that's worked out, kludge it if ( ( $sub_object || $updates || $might_belong_to || $join_type eq 'LEFT' ) && ref $info->{cond} ne 'CODE' ) { $object->$name($sub_object); } } else { $self->throw_exception( "recursive_update doesn't now how to handle relationship '$name' with accessor " . $info->{attrs}{accessor} ); } } sub is_m2m { my ( $self, $relation ) = @_; my $rclass = $self->result_class; # DBIx::Class::IntrospectableM2M if ( $rclass->can('_m2m_metadata') ) { return $rclass->_m2m_metadata->{$relation}; } my $object = $self->new_result( {} ); if ( $object->can($relation) and !$self->result_source->has_relationship($relation) and $object->can( 'set_' . $relation ) ) { return 1; } return; } sub get_m2m_source { my ( $self, $relation ) = @_; my $rclass = $self->result_class; # DBIx::Class::IntrospectableM2M if ( $rclass->can('_m2m_metadata') ) { return $self->result_source->related_source( $rclass->_m2m_metadata->{$relation}{relation} ) ->related_source( $rclass->_m2m_metadata->{$relation}{foreign_relation} ); } my $object = $self->new_result( {} ); my $r = $object->$relation; return $r->result_source; } sub _delete_empty_auto_increment { my ( $self, $object ) = @_; for my $col ( sort keys %{ $object->{_column_data} } ) { if ( $object->result_source->column_info($col)->{is_auto_increment} and ( !defined $object->{_column_data}{$col} or $object->{_column_data}{$col} eq '' ) ) { delete $object->{_column_data}{$col}; } } } sub _get_pk_for_related { my ( $self, $relation ) = @_; my $source; if ( $self->result_source->has_relationship($relation) ) { $source = $self->result_source->related_source($relation); } # many to many case if ( is_m2m( $self, $relation ) ) { $source = get_m2m_source( $self, $relation ); } return $source->primary_columns; } # This function determines whether a relationship should be done before or # after the row is inserted into the database # relationships before: belongs_to # relationships after: has_many, might_have and has_one # true means before, false after sub _master_relation_cond { my ( $self, $name ) = @_; my $source = $self->result_source; my $info = $source->relationship_info($name); # has_many rels are always after return 0 if $info->{attrs}->{accessor} eq 'multi'; my @foreign_ids = _get_pk_for_related( $self, $name ); my $cond = $info->{cond}; sub _inner { my ( $source, $cond, @foreign_ids ) = @_; while ( my ( $f_key, $col ) = each %{$cond} ) { # might_have is not master $col =~ s/^self\.//; $f_key =~ s/^foreign\.//; if ( $source->column_info($col)->{is_auto_increment} ) { return 0; } if ( any { $_ eq $f_key } @foreign_ids ) { return 1; } } return 0; } if ( ref $cond eq 'HASH' ) { return _inner( $source, $cond, @foreign_ids ); } # arrayref of hashrefs elsif ( ref $cond eq 'ARRAY' ) { for my $new_cond (@$cond) { return _inner( $source, $new_cond, @foreign_ids ); } } # we have a custom join condition, so update afterward elsif ( ref $cond eq 'CODE' ) { return 0; } else { $source->throw_exception( "unhandled relation 
condition " . ref($cond) ); } return; } 1; __END__ =pod =encoding UTF-8 =head1 NAME DBIx::Class::ResultSet::RecursiveUpdate - like update_or_create - but recursive =head1 VERSION version 0.40 =head1 SYNOPSIS # The functional interface: my $schema = MyDB::Schema->connect(); my $new_item = DBIx::Class::ResultSet::RecursiveUpdate::Functions::recursive_update( resultset => $schema->resultset('User'), updates => { id => 1, owned_dvds => [ { title => "One Flew Over the Cuckoo's Nest" } ] }, unknown_params_ok => 1, ); # As ResultSet subclass: __PACKAGE__->load_namespaces( default_resultset_class => '+DBIx::Class::ResultSet::RecursiveUpdate' ); # in the Schema file (see t/lib/DBSchema.pm). Or appropriate 'use base' in the ResultSet classes. my $user = $schema->resultset('User')->recursive_update({ id => 1, owned_dvds => [ { title => "One Flew Over the Cuckoo's Nest" } ] }, { unknown_params_ok => 1, }); # You'll get a warning if you pass non-result specific data to # recursive_update. See L # for more information how to prevent this. =head1 DESCRIPTION You can feed the ->create method of DBIx::Class with a recursive datastructure and have the related records created. Unfortunately you cannot do a similar thing with update_or_create. This module tries to fill that void until L has an api itself. The functional interface can be used without modifications of the model, for example by form processors like L. It is a base class for Ls providing the method recursive_update which works just like update_or_create but can recursively update or create result objects composed of multiple rows. All rows need to be identified by primary keys so you need to provide them in the update structure (unless they can be deduced from the parent row. For example a related row of a belongs_to relationship). If any of the primary key columns are missing, a new row will be created, with the expectation that the missing columns will be filled by it (as in the case of auto_increment primary keys). If the resultset itself stores an assignment for the primary key, like in the case of: my $restricted_rs = $user_rs->search( { id => 1 } ); you need to inform recursive_update about the additional predicate with the fixed_fields attribute: my $user = $restricted_rs->recursive_update( { owned_dvds => [ { title => 'One Flew Over the Cuckoo's Nest' } ] }, { fixed_fields => [ 'id' ], } ); For a many_to_many (pseudo) relation you can supply a list of primary keys from the other table and it will link the record at hand to those and only those records identified by them. This is convenient for handling web forms with check boxes (or a select field with multiple choice) that lets you update such (pseudo) relations. For a description how to set up base classes for ResultSets see L. =head2 Additional data in the updates hashref If you pass additional data to recursive_update which doesn't match a column name, column accessor, relationship or many-to-many helper accessor, it will throw a warning by default. To disable this behaviour you can set the unknown_params_ok attribute to a true value. The warning thrown is: "No such column, relationship, many-to-many helper accessor or generic accessor '$key'" When used by L this can happen if you have additional form fields that aren't relevant to the database but don't have the noupdate attribute set to a true value. NOTE: in a future version this behaviour will change and throw an exception instead of a warning! 
=head1 DESIGN CHOICES

Columns and relationships which are excluded from the updates hashref aren't
touched at all.

=head2 Treatment of belongs_to relations

In case the relationship is included but undefined in the updates hashref, all
columns forming the relationship will be set to null. If not all of them are
nullable, DBIx::Class will throw an error.

Updating the relationship:

    my $dvd = $dvd_rs->recursive_update( {
        id    => 1,
        owner => $user->id,
    });

Clearing the relationship (only works if cols are nullable!):

    my $dvd = $dvd_rs->recursive_update( {
        id    => 1,
        owner => undef,
    });

Updating a relationship including its (full) primary key:

    my $dvd = $dvd_rs->recursive_update( {
        id    => 1,
        owner => {
            id   => 2,
            name => "George",
        },
    });

=head2 Treatment of might_have relationships

In case the relationship is included but undefined in the updates hashref, all
columns forming the relationship will be set to null.

Updating the relationship:

    my $user = $user_rs->recursive_update( {
        id      => 1,
        address => {
            street => "101 Main Street",
            city   => "Podunk",
            state  => "New York",
        },
    });

Clearing the relationship:

    my $user = $user_rs->recursive_update( {
        id      => 1,
        address => undef,
    });

=head2 Treatment of has_many relations

If a relationship key is included in the data structure with a value of undef
or an empty array, all existing related rows will be deleted, or their foreign
key columns will be set to null.

The exact behaviour depends on the nullability of the foreign key columns and
the value of the "if_not_submitted" parameter. The parameter defaults to
undefined, in which case recursive_update chooses between nullifying and
deleting based on the nullability of the foreign key columns, as described
below.

When the array contains elements they are updated if they exist, created when
not and deleted if not included.

=head3 All foreign table columns are nullable

In this case recursive_update defaults to nullifying the foreign columns.

=head3 Not all foreign table columns are nullable

In this case recursive_update deletes the foreign rows.

Updating the relationship:

Passing ids:

    my $user = $user_rs->recursive_update( {
        id         => 1,
        owned_dvds => [ 1, 2 ],
    });

Passing hashrefs:

    my $user = $user_rs->recursive_update( {
        id         => 1,
        owned_dvds => [
            { name => 'temp name 1' },
            { name => 'temp name 2' },
        ],
    });

Passing objects:

    my $user = $user_rs->recursive_update( {
        id         => 1,
        owned_dvds => [ $dvd1, $dvd2 ],
    });

You can even mix them:

    my $user = $user_rs->recursive_update( {
        id         => 1,
        owned_dvds => [ 1, { id => 2 } ],
    });

Clearing the relationship:

    my $user = $user_rs->recursive_update( {
        id         => 1,
        owned_dvds => undef,
    });

This is the same as passing an empty array:

    my $user = $user_rs->recursive_update( {
        id         => 1,
        owned_dvds => [],
    });

=head2 Treatment of many-to-many pseudo relations

If a many-to-many accessor key is included in the data structure with a value
of undef or an empty array, all existing related rows are unlinked.

When the array contains elements they are updated if they exist, created when
not and deleted if not included.

RecursiveUpdate defaults to calling 'set_$rel' to update many-to-many
relationships. See L<DBIx::Class::Relationship/many_to_many> for details.
set_$rel effectively removes and re-adds all relationship data, even if the
set of related items did not change at all.

If L<DBIx::Class::IntrospectableM2M> is in use, RecursiveUpdate will look up
the corresponding has_many relationship and use this to recursively update the
many-to-many relationship.

While both mechanisms have the same final result, deleting and re-adding all
relationship data can have unwanted consequences if triggers or method
modifiers are defined or logging modules are in use.
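
For reference, a Result class that makes its many-to-many metadata
introspectable might look like the following sketch; the class, table and
relationship names are illustrative only:

    package MyApp::Schema::Result::Dvd;
    use strict;
    use warnings;
    use base 'DBIx::Class::Core';

    # Load IntrospectableM2M before declaring the many_to_many bridge so
    # that RecursiveUpdate can resolve the pseudo relation 'tags' to the
    # underlying has_many 'dvdtags' instead of calling set_tags.
    __PACKAGE__->load_components('IntrospectableM2M');

    __PACKAGE__->table('dvd');
    __PACKAGE__->add_columns(qw/ dvd_id name /);
    __PACKAGE__->set_primary_key('dvd_id');

    # bridge table relationship; Dvdtag is assumed to belong_to 'tag'
    __PACKAGE__->has_many(
        dvdtags => 'MyApp::Schema::Result::Dvdtag', 'dvd_id' );
    __PACKAGE__->many_to_many( tags => 'dvdtags', 'tag' );

    1;
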
The traditional "set_$rel" behaviour can be forced by passing "m2m_force_set_rel => 1" to recursive_update. See L for many-to-many pseudo relationship detection. Updating the relationship: Passing ids: my $dvd = $dvd_rs->recursive_update( { id => 1, tags => [1, 2], }); Passing hashrefs: my $dvd = $dvd_rs->recursive_update( { id => 1, tags => [ { id => 1, file => 'file0' }, { id => 2, file => 'file1', }, ], }); Passing objects: my $dvd = $dvd_rs->recursive_update( { id => 1, tags => [ $tag1, $tag2 ], }); You can even mix them: my $dvd = $dvd_rs->recursive_update( { id => 1, tags => [ 2, { id => 3 } ], }); Clearing the relationship: my $dvd = $dvd_rs->recursive_update( { id => 1, tags => undef, }); This is the same as passing an empty array: my $dvd = $dvd_rs->recursive_update( { id => 1, tags => [], }); Make sure that set_$rel used to update many-to-many relationships even if IntrospectableM2M is loaded: my $dvd = $dvd_rs->recursive_update( { id => 1, tags => [1, 2], }, { m2m_force_set_rel => 1 }, ); =head1 INTERFACE =head1 METHODS =head2 recursive_update The method that does the work here. =head2 is_m2m =over 4 =item Arguments: $name =item Return Value: true, if $name is a many to many pseudo-relationship =back The function gets the information about m2m relations from L. If it isn't loaded in the ResultSource class, the code relies on the fact: if($object->can($name) and !$object->result_source->has_relationship($name) and $object->can( 'set_' . $name ) ) to identify a many to many pseudo relationship. In a similar ugly way the ResultSource of that many to many pseudo relationship is detected. So if you need many to many pseudo relationship support, it's strongly recommended to load L in your ResultSource class! =head2 get_m2m_source =over 4 =item Arguments: $name =item Return Value: $result_source =back =head1 CONFIGURATION AND ENVIRONMENT DBIx::Class::RecursiveUpdate requires no configuration files or environment variables. =head1 DEPENDENCIES DBIx::Class optional but recommended: DBIx::Class::IntrospectableM2M =head1 INCOMPATIBILITIES None reported. =head1 BUGS AND LIMITATIONS The list of reported bugs can be viewed at L. Please report any bugs or feature requests to C, or through the web interface at L. =head1 AUTHORS =over 4 =item * Zbigniew Lukasiak =item * John Napiorkowski =item * Alexander Hartmaier =item * Gerda Shank =back =head1 COPYRIGHT AND LICENSE This software is copyright (c) 2019 by Zbigniew Lukasiak, John Napiorkowski, Alexander Hartmaier. This is free software; you can redistribute it and/or modify it under the same terms as the Perl 5 programming language system itself. =cut