On Friday, May 13, 2011 8:15:27 am John Fabiani wrote:
> On Friday, May 13, 2011 08:08:19 am Adrian Klaver wrote:
> > > On the surface that sounds good, but I could not figure out how to do
> > > it in a dynamic way.  I ran into the same problem Dabo has.  How can I
> > > determine the data fields and provide fake data for the first
> > > record?
> > 
> > You say you are pulling the data from Postgres.  psycopg2 captures the
> > fields and  type info in cursor.description. As to fake data:
> > 
> > select 1::int, '1901-01-01'::date, 'test'::varchar;
> > 
> > Use to create table/dataset and then insert rest of data into dataset?
> > 
> > Got to run, will ponder further.
> 
> COOL!!!  Never considered using the Postgres info.  Not completely sure how
> at the moment but I do know a DataSource description is available for each
> of the database engines Dabo supports.
> 
> Hmmmm  I'll have to think about this.
> 
> Johnf

Looking into this further I found:

dDataSet.py uses the following for its connection:
self._connection = sqlite.connect(":memory:",
          detect_types=(sqlite.PARSE_DECLTYPES|sqlite.PARSE_COLNAMES),
          isolation_level="EXCLUSIVE")
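
For reference, here is a minimal sketch of what the PARSE_DECLTYPES flag in that call buys you (the table and column names below are illustrative, not Dabo's): a column declared with type "date" comes back as a datetime.date instead of a string.

```python
import sqlite3
import datetime

# Same detect_types flags as dDataSet.py uses.
con = sqlite3.connect(":memory:",
        detect_types=(sqlite3.PARSE_DECLTYPES | sqlite3.PARSE_COLNAMES))
cur = con.cursor()

# Declared type "date" triggers sqlite3's default date converter.
cur.execute("create table t (d date)")
cur.execute("insert into t values (?)", (datetime.date(1901, 1, 1),))
d = cur.execute("select d from t").fetchone()[0]
print(type(d))  # <class 'datetime.date'>
```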

From here:
http://docs.pysqlite.googlecode.com/hg/sqlite3.html#default-adapters-and-converters


It should be possible to do:

select date_field as "date_field [date]" from dataset

and get the correct data type.
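
As a quick sanity check of the column-name trick, here is a self-contained sketch; "dataset" and "date_field" mirror the query above but are made-up names, and the data in the text column is an ISO date string:

```python
import sqlite3
import datetime

con = sqlite3.connect(":memory:",
        detect_types=(sqlite3.PARSE_DECLTYPES | sqlite3.PARSE_COLNAMES))
cur = con.cursor()

# Column is declared text, so only the "[date]" in the column alias
# tells sqlite3 which converter to apply (PARSE_COLNAMES wins here).
cur.execute("create table dataset (date_field text)")
cur.execute("insert into dataset values ('1901-01-01')")
cur.execute('select date_field as "date_field [date]" from dataset')
row = cur.fetchone()
print(type(row[0]))  # <class 'datetime.date'>
```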

-- 
Adrian Klaver
adrian.kla...@gmail.com
_______________________________________________
Post Messages to: Dabo-users@leafe.com
Subscription Maintenance: http://leafe.com/mailman/listinfo/dabo-users
Searchable Archives: http://leafe.com/archives/search/dabo-users
This message: 
http://leafe.com/archives/byMID/201105131206.58421.adrian.kla...@gmail.com
