
Author jondo
Recipients berker.peksag, ghaering, jondo, matrixise, p-ganssle
Date 2018-11-06.09:43:06
Content
Here is a reduced version of my real use case: 2017v2-reduced.db. The file was originally created with "Oracle DBConvert"; it contains a table with no rows.
I was wrong in my last comment: no sniffing is needed here either, as either of the following outputs shows:
```
In [5]: pd.read_sql_query('PRAGMA table_info("t2")', con)['type']
Out[5]: 
0       DATETIME
1    VARCHAR (3)
Name: type, dtype: object

In [6]: pd.read_sql_query('SELECT SQL FROM sqlite_master WHERE name = "t2"', con).iloc[0,0]
Out[6]: 'CREATE TABLE t2 (localtime DATETIME DEFAULT NULL, freq VARCHAR (3))'
```
This also works with the original file from "Oracle DBConvert".
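
For reference, the same declared-type information can be read with the stdlib sqlite3 module alone; a minimal sketch, assuming the attached 2017v2-reduced.db is in the working directory:
```
import sqlite3

con = sqlite3.connect("2017v2-reduced.db")

# PRAGMA table_info yields (cid, name, type, notnull, dflt_value, pk) per column.
for _cid, name, decltype, *_rest in con.execute('PRAGMA table_info("t2")'):
    print(name, decltype)  # localtime DATETIME / freq VARCHAR (3)

# The complete CREATE TABLE statement is stored in sqlite_master.
print(con.execute("SELECT sql FROM sqlite_master WHERE name = ?", ("t2",)).fetchone()[0])
```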

Wouldn't it make sense for sqlite3 to use this information, e.g. when connect() is called with something like `detect_types=CONVERT_DATETIME`? The option could even be named CONVERT_ORACLE_DATETIME to make clear that it cannot be expected to work with files from other sources.
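
As a data point, something close to this can already be approximated today with detect_types=PARSE_DECLTYPES plus a user-registered converter; a minimal sketch, assuming the DATETIME values are stored as ISO 8601 text:
```
import sqlite3
import datetime

# With PARSE_DECLTYPES, sqlite3 reads the declared column type from the
# CREATE TABLE statement and applies the converter registered under that name.
sqlite3.register_converter(
    "DATETIME",
    lambda b: datetime.datetime.fromisoformat(b.decode()),
)

con = sqlite3.connect("2017v2-reduced.db", detect_types=sqlite3.PARSE_DECLTYPES)
for row in con.execute("SELECT * FROM t2"):
    print(row)  # the localtime column arrives as datetime.datetime
```
The difference to the proposal is that the user has to register the converter; something like CONVERT_DATETIME could essentially ship such a registration by default.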

If yes, then I suggest changing this issue's title to '"SELECT *" should optionally autoconvert DATETIME fields if found in DDL'.

The question is, of course, how much SQLite-generating software creates this metadata. But Oracle is certainly an important player. And sqlite3 itself could also save this DDL instead of extending the field names (which seems hackier).
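
For comparison, the existing field-name extension mentioned above (detect_types=PARSE_COLNAMES) requires smuggling the type into the column aliases of every query; a minimal sketch, again assuming ISO 8601 text:
```
import sqlite3
import datetime

sqlite3.register_converter(
    "DATETIME",
    lambda b: datetime.datetime.fromisoformat(b.decode()),
)

con = sqlite3.connect("2017v2-reduced.db", detect_types=sqlite3.PARSE_COLNAMES)
# The [DATETIME] marker in the alias tells sqlite3 which converter to apply.
rows = con.execute(
    'SELECT localtime AS "localtime [DATETIME]", freq FROM t2'
).fetchall()
```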