Re: [Catalyst] RFC: Catalyst::Controller::REST::DBIC
Hi there,

This is a frequently recurring conversation, so I created a wiki page to gather all the points where we have reached some consensus: http://catwiki.toeat.com/crud. For a start I just dumped my opinions. I tried not to be controversial - but it is a wiki, so if you don't agree you can edit it and make it more acceptable to you. I am especially waiting for people with opinions on the REST and browser-REST part - I have not got much knowledge in that area.

Cheers,
Zbigniew

On Sun, May 4, 2008 at 2:38 AM, luke saunders [EMAIL PROTECTED] wrote:

I have started to write a Catalyst base controller for REST-style CRUD via DBIC. I have noticed that a number of other people have been working on, or are thinking about working on, something similar - most notably J. Shirley, who seems to be creating Catalyst::Controller::REST::DBIC::Item (http://dev.catalystframework.org/svnweb/Catalyst/browse/Catalyst-Controller-REST-DBIC-Item/), and some chaps from a recent thread on this list (entitled "Dispatching with Chained vs HTTP method"). Ideally I would like to merge J. Shirley's effort into mine (or vice versa) along with anything that anyone else has. Basically I want to avoid ending up with a load of modules that all do the same thing.

My effort is heavily based on something mst wrote a while ago, and since then I've ended up writing something very similar for every project I've worked on, which indicates it's worth open-sourcing. Essentially it is used like so:

    package MyApp::Controller::API::REST::CD;
    use base qw/Catalyst::Controller::REST::DBIC/;
    ...
    __PACKAGE__->config(
        action => {
            setup => { PathPart => 'cd', Chained => '/api/rest/rest_base' }
        },
        class           => 'RestTestDB::CD',
        create_requires => [ 'artist', 'title', 'year' ],
        update_allows   => [ 'title', 'year' ],
    );

And this gets you the following endpoints to fire requests at:

    /api/rest/cd/create
    /api/rest/cd/id/[cdid]/update
    /api/rest/cd/id/[cdid]/delete
    /api/rest/cd/id/[cdid]/add_to_rel/[relation]
    /api/rest/cd/id/[cdid]/remove_from_rel/[relation]

The full source is here: http://lukesaunders.me.uk/dists/Catalyst-Controller-REST-DBIC-1.00.tar.gz

If you have a few moments please have a look, especially if you are working on something similar. Today I even wrote a test suite which has a test app and is probably the best place to look to see what it does. Note that it lacks:

- list and view type methods which dump objects to JSON (or whatever)
- clever validation - it should validate based on the DBIC column definitions, but it doesn't
- any auth - not sure if it should have it or not, but it's possible

Also, it doesn't distinguish between POST, PUT, DELETE and GET HTTP requests, favouring instead entirely separate endpoints, but that's up for discussion.

So, J. Shirley, do you have any interest in a merge? And others, do you have ideas and would you like to contribute?

Thanks, Luke.

___
List: Catalyst@lists.scsys.co.uk
Listinfo: http://lists.scsys.co.uk/cgi-bin/mailman/listinfo/catalyst
Searchable archive: http://www.mail-archive.com/catalyst@lists.scsys.co.uk/
Dev site: http://dev.catalyst.perl.org/

--
Zbigniew Lukasiak
http://brudnopis.blogspot.com/
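For readers comparing URL schemes in this thread, here is a minimal sketch of how the endpoint list above is composed from a chained base path plus a per-class PathPart. It is written in Python purely for illustration; the `endpoint()` helper is hypothetical and only mirrors the URL layout, not anything in Luke's module.

```python
# Sketch of the endpoint layout described above: a chained base
# ('/api/rest') plus a PathPart ('cd'), with the action name and the
# object id as trailing segments. Hypothetical helper, illustration only.

BASE = "/api/rest"

def endpoint(path_part, action, obj_id=None, relation=None):
    """Build one of the CRUD endpoint URLs shown in the post."""
    parts = [BASE, path_part]
    if obj_id is not None:
        parts += ["id", str(obj_id)]
    parts.append(action)
    if relation is not None:
        parts.append(relation)
    return "/".join(parts)

print(endpoint("cd", "create"))                   # /api/rest/cd/create
print(endpoint("cd", "update", obj_id=5))         # /api/rest/cd/id/5/update
print(endpoint("cd", "add_to_rel", 5, "tracks"))  # /api/rest/cd/id/5/add_to_rel/tracks
```

Note how the scheme keeps the action name in the path, which is exactly the design choice debated later in the thread.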
Re: [Catalyst] Anybody who fancies some LWP poking ...
Hi!

On Sun, May 11, 2008 at 07:10:27PM +0100, Leon Brocard wrote:

2008/5/10 Daniel McBrearty [EMAIL PROTECTED]: I'd like Leon's opinion on this. Forwarding to him again.

I understand many bits of it but have given up trying to get it working. I would love a patch which passes tests on both old and new libwww-perls.

Today we were hit by this, and I dug into the code... The problem seems to lie in WWW::Mechanize and Test::WWW::Mechanize::Catalyst. I was able to (sort of) fix it with the attached two patches. T:W:M:C tests work after those patches (as do our tests...), but WWW::Mechanize spews some "Parsing of undecoded UTF-8 will give garbage when decoding entities" warnings. And I'm not in the mood for utf8 debugging...

I have to say that I did not analyse the whole problem, and in fact we just downgraded to libwww-perl-5.808. But if these findings help someone with deeper knowledge to really solve the problem, I'd be delighted!

--
#!/usr/bin/perl http://domm.plix.at
for(ref bless{},just'another'perl'hacker){s-:+-$"-g&&print$_.$/}

    diff -ur WWW-Mechanize-1.34/lib/WWW/Mechanize.pm WWW-Mechanize-1.34-patched/lib/WWW/Mechanize.pm
    --- WWW-Mechanize-1.34/lib/WWW/Mechanize.pm 2007-12-10 07:31:20.0 +0100
    +++ WWW-Mechanize-1.34-patched/lib/WWW/Mechanize.pm 2008-05-15 13:28:03.0 +0200
    @@ -2148,7 +2148,7 @@
         # See docs in HTTP::Message for details. Do we need to expose the options there?
         # use charset => 'none' because while we want LWP to handle Content-Encoding for
         # the auto-gzipping with Compress::Zlib we don't want it messing with charset
    -    my $content = $res->decoded_content( charset => 'none' );
    +    my $content = $res->decoded_content();
         $content = $res->content if (not defined $content);
         $content .= _taintedness();

    Only in WWW-Mechanize-1.34-patched/: Makefile.old

    diff -ur WWW-Mechanize-1.34/t/local/log-server WWW-Mechanize-1.34-patched/t/local/log-server
    --- WWW-Mechanize-1.34/t/local/log-server 2007-08-27 02:47:30.0 +0200
    +++ WWW-Mechanize-1.34-patched/t/local/log-server 2008-05-15 13:14:10.0 +0200
    @@ -88,7 +88,7 @@
         <a href="/foo1.save_log_server_test.tmp">Link foo1.save_log_server_test.tmp</a>
         <a href="/foo2.save_log_server_test.tmp">Link foo2.save_log_server_test.tmp</a>
         <a href="/foo3.save_log_server_test.tmp">Link foo3.save_log_server_test.tmp</a>
    -    <a href="/o-umlaut">Löschen -- testing for o-umlaut.</a>
    +    <a href="/o-umlaut">Löschen -- testing for o-umlaut.</a>
         <a href="/o-umlaut-encoded">St&ouml;sberg -- testing for encoded o-umlaut.</a>
         <table>

    diff -ru Test-WWW-Mechanize-Catalyst-0.42/lib/Test/WWW/Mechanize/Catalyst.pm Test-WWW-Mechanize-Catalyst-0.42-patched/lib/Test/WWW/Mechanize/Catalyst.pm
    --- Test-WWW-Mechanize-Catalyst-0.42/lib/Test/WWW/Mechanize/Catalyst.pm 2008-04-29 21:25:55.0 +0200
    +++ Test-WWW-Mechanize-Catalyst-0.42-patched/lib/Test/WWW/Mechanize/Catalyst.pm 2008-05-15 13:30:44.0 +0200
    @@ -89,7 +89,6 @@
         # For some reason Test::WWW::Mechanize uses $response->content everywhere
         # instead of $response->decoded_content;
    -    $response->content( $response->decoded_content );
     }
     return $response;
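The "Parsing of undecoded UTF-8 will give garbage" warning and the garbled-umlaut symptom in these patches both stem from the same failure mode: UTF-8 bytes being interpreted under the wrong (or no) charset. A tiny Python sketch of that mechanism, for illustration only; it does not reproduce Mechanize's internals:

```python
# UTF-8 bytes for 'Löschen'. Decoding them with UTF-8 is correct;
# decoding the same bytes as Latin-1 (what effectively happens when a
# UTF-8 body is handled as undecoded legacy text) yields the classic
# 'LÃ¶schen' mojibake seen in double-decoded pages.
raw = "Löschen".encode("utf-8")

correct = raw.decode("utf-8")     # round-trips cleanly
mojibake = raw.decode("latin-1")  # each UTF-8 byte becomes its own char

print(correct)
print(mojibake)
```

This is why the choice between `decoded_content` and raw `content` matters: mixing the two, or decoding twice, garbles every non-ASCII character.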
Re: [Catalyst] RFC: Catalyst::Controller::REST::DBIC
On Thu, May 15, 2008 at 7:31 PM, Mark Trostler [EMAIL PROTECTED] wrote:

You don't need the 'create', 'update' and 'delete' parts of your URL - those should be denoted by the request type - POST, PUT, or DELETE, right?

Yes - you are right about REST, but we want something more than that. What we want to have is a REST interface together with something REST-like that will work for browsers.

Similarly you don't need 'id' in the URL - so a POST to /api/rest/cd will create a CD, a PUT to /api/rest/cd/5 will update that CD, and a DELETE to /api/rest/cd/5 will delete that CD...

Additionally, we would like to have other non-REST actions in the same controller. This mixing will require some separation between the method names and the object id (which is data). This is why I propose /cd/instance/5 for the retrieve action.

--
Zbigniew
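Mark's point - that the HTTP method, not a URL segment, should select the operation - can be sketched as a small dispatch table. This is a Python illustration of the idea only, not how Catalyst's dispatcher (or either proposed module) works:

```python
# Map (HTTP method, whether the path ends in an id) to a CRUD action
# for a resource like /api/rest/cd and /api/rest/cd/5. Illustrative only.
def crud_action(method, path):
    segments = [s for s in path.split("/") if s]
    has_id = bool(segments) and segments[-1].isdigit()
    table = {
        ("POST", False):   "create",
        ("GET", False):    "list",
        ("GET", True):     "view",
        ("PUT", True):     "update",
        ("DELETE", True):  "delete",
    }
    return table.get((method, has_id), "not_allowed")

print(crud_action("POST", "/api/rest/cd"))      # create
print(crud_action("PUT", "/api/rest/cd/5"))     # update
print(crud_action("DELETE", "/api/rest/cd/5"))  # delete
```

The trade-off raised in the thread is that plain browser forms can only send GET and POST, which is why a browser-friendly variant still needs action names somewhere.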
[Catalyst] Re: RFC: Catalyst::Controller::REST::DBIC
* Zbigniew Lukasiak [EMAIL PROTECTED] [2008-05-15 21:25]:

On Thu, May 15, 2008 at 7:31 PM, Mark Trostler [EMAIL PROTECTED] wrote: Similarly you don't need 'id' in the url - so POST to /api/rest/cd will create a cd. A PUT to /api/rest/cd/5 will update that CD - a DELETE to /api/rest/cd/5 will delete that CD...

Additionally we would like to have other non REST actions in the same controller. This mixing will require some separation between the method names and the object id (which is data).

You are thinking about it wrong. You don’t put method names into the URI; you expose more of the state as things to manipulate. If you give an example of what you mean, I can give you one of what I mean.

Regards,
--
Aristotle Pagaltzis // http://plasmasturm.org/
Re: [Catalyst] uri_for() doesn't encode to utf8 first argument
On Thu, May 15, 2008 at 11:50 PM, Dmitriy S. Sinyavskiy [EMAIL PROTECTED] wrote:

So what about my patch and test? Is it right, or can someone correct it? There has been silence for a week or more :(

Jonathan Rockway asked you to regenerate it without whitespace changes. We are just waiting on that from you to properly review it.

-J
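For context on what "encode to utf8 first" means for uri_for(): a URI may only carry ASCII, so non-ASCII arguments must first be encoded as UTF-8 bytes and then percent-escaped byte by byte. A Python sketch of that two-step (illustrative only; this is not the uri_for patch itself):

```python
from urllib.parse import quote

# quote() UTF-8-encodes the string and then %XX-escapes each byte,
# which is the behaviour the patch wants uri_for() to guarantee.
segment = "zażółć"          # arbitrary non-ASCII path segment
escaped = quote(segment)

print(escaped)  # za%C5%BC%C3%B3%C5%82%C4%87
```

Skipping the encode step (escaping raw characters, or passing bytes in an unknown charset) is what produces invalid or double-encoded URIs.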
Re: [Catalyst] Output as XML
On Thu, May 15, 2008 at 10:49 PM, Mitch Jackson [EMAIL PROTECTED] wrote:

Russell,

Thanks for the suggestion. I looked at that; however, it basically does what I'm already doing. The bottleneck wasn't so much TT but the creation of thousands of DBIC objects and sticking them into an array. The same would need to be done with C::V::REST::XML, as it serializes the stash. I needed an approach that generated the XML while walking the query results, rather than caching them all in memory first.

Here's what I ended up doing. It needs more work to support joins or complex queries, but the speed difference is insane. Here are benchmark results comparing pulls of 100, 1000, 5000 and 15000 table rows using the old way and the following function. As you can see, sending a DBIC array of 15,000 rows to TT took 228 seconds to render :-( This xml() method took 1.65 seconds.

    $ perl xmlbench.pl
                 Rate  obj 100  xml 100
    obj 100    3.53/s       --     -94%
    xml 100    62.5/s    1669%       --

               s/iter  obj 1000  xml 1000
    obj 1000     3.38        --      -97%
    xml 1000    0.112     2932%        --

               s/iter  obj 5000  xml 5000
    obj 5000     32.3        --      -98%
    xml 5000    0.549     5779%        --

               s/iter  obj 15000  xml 15000
    obj 15000     228         --       -99%
    xml 15000    1.65     13753%         --

    ## include in the schema class
    # Use in place of ->search to return an XML document containing the
    # records for the query
    #
    # my $xml = $schema->xml( 'table', { field => 'value' }, { rows => 20 } );
    sub xml {
        my ( $self, $model, @search_params ) = @_;
        croak 'xml( $model, @params ) requires a model parameter'
            unless defined $model and $model;

        my %xml_escape_map = (
            '<' => '&lt;',
            '>' => '&gt;',
            '"' => '&quot;',
            '&' => '&amp;',
        );

        # Prepare the query
        my $rs = $self->resultset($model)->search(@search_params);
        croak 'xml() unable to prepare query' unless defined $rs;

        # Begin the XML document
        my $xml = '<?xml version="1.0" encoding="utf-8" ?>' . "\n"
            . '<total_records>' . $rs->count . '</total_records>' . "\n"
            . '<records>' . "\n";

        # Add an xml block for each record in the set
        my @cols   = $self->resultset($model)->result_source->columns;
        my $cursor = $rs->cursor;
        while ( my @rec = $cursor->next ) {
            $xml .= '<record>' . "\n";
            for my $f (@cols) {
                my $v = shift @rec;
                $v =~ s/([<>"&])/$xml_escape_map{$1}/g;
                $xml .= "<$f>$v</$f>\n";
            }
            $xml .= '</record>' . "\n";
        }

        # Terminate the xml
        $xml .= '</records>' . "\n";
        return $xml;
    }

/Mitchell K. Jackson

On Wed, May 14, 2008 at 10:10 PM, Russell Jurney [EMAIL PROTECTED] wrote:

Have you thought about using this: http://search.cpan.org/~sri/Catalyst-View-REST-XML-0.01/XML.pm with raw data to achieve the desired speed? Not sure where your bottleneck is, but if TT is the problem then I assume XML::Simple is faster than TT at serializing XML?

Russell Jurney [EMAIL PROTECTED]

On May 14, 2008, at 10:02 AM, Mitch Jackson wrote:

Good morning!

I'm about to start working on some DBIC-query-to-XML code, but before I do I was wondering if anybody out there has already done this, or if perhaps my approach is thick-headed.

I'm generating XML from database queries in a Catalyst app. At the moment, I am doing it a bit like this (simplified for readability):

    # controller.pm  /controller/action/parm1/parm2/parm3/something.xml
    sub action : Local {
        ...
        $c->stash->{records} = [
            $c->model('table')->search( {}, { rows => 20, page => 2 } )
        ];
        $c->res->content_type('text/xml');
        $c->res->header(
            'Content-disposition' => "attachment; filename=action_${timestamp}.xml"
        );
        $c->res->template('xml/action.xml');
    }

    # xml/action.xml
    <?xml version="1.0" encoding="utf-8" ?>
    <records>
    [% FOREACH record IN records -%]
      <record id="[% record.id %]">
        <field1>[% record.field1 %]</field1>
        <field2>[% record.field2 %]</field2>
        <field3>[% record.field3 %]</field3>
      </record>
    [% END # foreach record -%]
    </records>

This approach works fine for paged record sets (that get loaded into an ExtJS ajax grid). When I use it on a record set of 15k-16k records, the app goes to 100% CPU and cannot complete the request after several minutes. There is a lot of overhead in generating 16k DBIC objects, dumping them into an array, and then manipulating them through TT. This speed problem is unacceptable for my app, especially considering my users may be dealing with much larger datasets than this.

One solution would be to write something proprietary to this implementation as a module that would throw away the overhead bloat and generate the XML file efficiently... but I want something reusable in the long term from a Catalyst perspective. I am considering writing some sort of DBIC-query-to-XML code that would use the DBI cursor directly to
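The core idea in Mitch's xml() method - stream rows straight off the cursor into escaped XML instead of materializing ORM objects - translates to any database API. A compact Python sketch of the same pattern (names and structure are illustrative; they mirror the shape of the Perl code above, not DBIC's API):

```python
# Build a <records> XML document by walking rows one at a time and
# escaping the four XML-special characters, instead of buffering a
# full list of ORM objects first. Illustration of the pattern only.
ESCAPES = {"<": "&lt;", ">": "&gt;", '"': "&quot;", "&": "&amp;"}

def escape(value):
    """Escape XML-special characters in a single field value."""
    return "".join(ESCAPES.get(ch, ch) for ch in str(value))

def rows_to_xml(columns, rows):
    """rows can be any iterable (e.g. a DB cursor); nothing is buffered."""
    out = ['<?xml version="1.0" encoding="utf-8" ?>', "<records>"]
    for row in rows:
        out.append("<record>")
        for name, value in zip(columns, row):
            out.append(f"<{name}>{escape(value)}</{name}>")
        out.append("</record>")
    out.append("</records>")
    return "\n".join(out)

print(rows_to_xml(["title", "year"], [("Q&A", 1999)]))
```

Because the per-character escape map is applied in a single pass, an ampersand is never double-escaped - the same property the Perl regex with a character class achieves.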
Re: [Catalyst] Output as XML
J,

My solution was better suited to the DBIx::Class list, I suppose, but I posed the question here to see if there was already some sort of Catalyst solution I had overlooked... a view, for example. I took a look at that part of the cookbook before, but it seems to only apply if you're pulling one table row, not a record set.

Thanks for the advice...

/Mitch
Re: [Catalyst] Re: RFC: Catalyst::Controller::REST::DBIC
On Thu, May 15, 2008 at 11:11 PM, Aristotle Pagaltzis [EMAIL PROTECTED] wrote:

You are thinking about it wrong. You don't put method names into the URI; you expose more of the state as things to manipulate. If you give an example of what you mean I can give you one of what I mean.

Hmm - frankly, I have never thought REST through entirely, but I have the feeling that it is always better to be cautious (and, you know, be liberal in what you receive). Some ideas off the top of my head:

- a login method (the advantage would be that it does not need to redirect)
- a search
- exposing some class data (as opposed to instance data), and maybe class methods?

Cheers,
Zbigniew Lukasiak
http://brudnopis.blogspot.com/
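To make Aristotle's "expose more of the state" suggestion concrete for Zbigniew's login example: instead of a /login method, the session itself can be modeled as a resource that is created and destroyed. The route table below is a hedged sketch of that idea in Python; it is an illustration, not a design proposed anywhere in this thread.

```python
# Login/logout modeled as CRUD on a 'session' resource: creating a
# session logs you in, deleting it logs you out. Search becomes a GET
# on the collection with query parameters. Purely illustrative names.
ROUTES = {
    ("POST", "/sessions"): "log in (create a session)",
    ("DELETE", "/sessions/current"): "log out (destroy the session)",
    ("GET", "/cds"): "search (filter via query parameters)",
}

def describe(method, path):
    """Look up what a (method, path) pair would do in this scheme."""
    return ROUTES.get((method, path), "unknown route")

print(describe("POST", "/sessions"))  # log in (create a session)
```

This keeps every URI a noun, which is the distinction Aristotle is drawing against method names like /cd/instance/5/update.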