Hi Gary,

I'm going to start it all over again this weekend, when I have time. I 
organized the log data from each year into a separate directory.
For each import I just edit the import path in the config file, then 
start wee_import with --import-config=myimportconfig.con --verbose and 
let it run....
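For reference, the part of the import config that I edit looks roughly like this (a sketch with hypothetical paths; the exact section and option names depend on the weewx version and import source type, so check the wee_import documentation for your setup):

```ini
# myimportconfig.con (excerpt, hypothetical values)
source = WD

[WD]
    # edit this path before each yearly run, e.g. the 2015 logs:
    directory = /home/pi/wd-logs/2015
```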

Here are about 6 years of data, some newer and some older. You can play 
around with it, if you like.
https://drive.google.com/drive/folders/1zwvlJSlGc0gtypuBaFIYq0V-jCg1e776?usp=sharing

I can upload the whole bunch, if you want.
Thank you for your help!!!

Peter

gjr80 wrote on Monday, 15 February 2021 at 21:48:22 UTC+1:

> Hi Peter,
>
> Before you start it off again, can you describe how you are doing the 
> import; i.e. what time period per import: one big import, or multiple 
> smaller periods? How is your data organised on disk: one large directory 
> with all files, year-by-year directories, etc.?
>
> Gary
>
> On Tuesday, 16 February 2021 at 00:00:27 UTC+10 peter.su...@gmail.com 
> wrote:
>
>> Hi Gary,
>>
>> today my Raspberry Pi 4 arrived, which will hopefully be my long-running 
>> weather computer.
>> I tried to import all my data on the Pi 3. It worked back for about 8 
>> years, but then the performance fell off a cliff.
>> The import of one year took about a week and kept getting worse.
>>
>> Now I'm starting all over on a fresh install and will report the 
>> result.
>>
>> Peter
>> gjr80 wrote on Monday, 1 February 2021 at 23:28:04 UTC+1:
>>
>>> Hi Peter,
>>>
>>> On Tuesday, 2 February 2021 at 06:03:36 UTC+10 peter.su...@gmail.com 
>>> wrote:
>>>
>>>> Hey Gary,
>>>> it's looking perfect. I just separated the logs for each year, and right 
>>>> now I am testing the 6th year in a row with a quick dry run.
>>>> The script finds some duplicates, but it looks like every complete 
>>>> year from 01/01 to 12/31 should be imported into the database.
>>>>
>>>
>>> That is interesting; the screen output you posted originally clearly 
>>> showed an error decoding a date-time, so I am surprised that re-arranging 
>>> the log files fixed that. The first person who trialled the WD import 
>>> module was in a similar boat to you: they had many years of WD data, and 
>>> when they first tried to import it all in one run, not only did it take 
>>> many hours, but wee_import eventually ground to a halt and crashed (the 
>>> exact details escape me). So importing/processing a year at a time may be 
>>> more manageable and safer.
>>>
>>> The duplicates issue is not uncommon; the few suites of WD logs I have 
>>> worked on all contained the odd duplicate date-time. wee_import will 
>>> simply ignore the duplicates; the only issue is when the data for each 
>>> duplicate differs. That is probably not going to make much difference 
>>> over 15 years of data at one-minute intervals.
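If you want to spot such duplicate date-times before a run, a minimal sketch in Python (this assumes the timestamp is the first two whitespace-separated fields of each log line — the actual field layout of WD logs varies, so verify against your own files and adjust `n_fields`):

```python
from collections import Counter

def duplicate_timestamps(lines, n_fields=2):
    """Return timestamps that occur on more than one log line.

    The timestamp is assumed to be the first `n_fields`
    whitespace-separated fields of each non-empty line.
    """
    counts = Counter(
        " ".join(line.split()[:n_fields])
        for line in lines
        if line.strip()
    )
    return sorted(ts for ts, n in counts.items() if n > 1)
```

wee_import itself already skips duplicates, so this is only useful for inspecting beforehand which records are affected.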
>>>
>>>> One more question from my side: would you prefer a MySQL database with 
>>>> all this mass of data?
>>>>
>>>
>>> I think the consensus is that unless you have a good reason to use 
>>> MySQL/MariaDB, stick with SQLite. SQLite is faster and simpler to 
>>> back up/restore, and it will easily handle all of your 15 years of data.
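On the backup point: Python's standard sqlite3 module has an online backup API that produces a consistent copy even while the database is being written to. A minimal sketch (the paths are hypothetical; weewx's default database is typically `weewx.sdb`, but check your own `weewx.conf` for the actual location):

```python
import sqlite3

def backup_sqlite(src_path, dst_path):
    """Copy an SQLite database using the online backup API."""
    src = sqlite3.connect(src_path)
    dst = sqlite3.connect(dst_path)
    try:
        with dst:
            src.backup(dst)  # takes a consistent snapshot of src
    finally:
        src.close()
        dst.close()

# e.g. backup_sqlite("/var/lib/weewx/weewx.sdb", "/backup/weewx.sdb")
```

Unlike a plain file copy, this is safe to run without stopping weewx first.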
>>>  
>>> Gary
>>>
>>

-- 
You received this message because you are subscribed to the Google Groups 
"weewx-user" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to weewx-user+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/weewx-user/6c84f279-cf8b-40df-a84f-7bb6032867bfn%40googlegroups.com.
