Thanks! Things are finally a bit better. You can read about some of the gymnastics we've had to do to be able to continue our daily lives in the information age :)
https://medium.com/@siloraptor/solar-powered-microservers-for-a-post-hurricane-maria-puerto-rico-ca83027d20ac

Electric service is more stable now (one blackout every week or two), but I still continue modifying hardware to run on DC off of the solar system as a precaution.

Yes, Eric and Michael have been doing a ton of work adding new features and even code from their commercial version (they have one install that receives 10,000 new documents per day). Their experience scaling Mayan has been very valuable.

For the database conversion we are making use of natural keys, which use a combination of fields instead of a sequential number to guarantee that data dumped from a model is reconstructed exactly the same in a new database (there is a rough sketch of the idea at the end of this message, below the quote). We also added an experimental "convertdb" console command that handles almost all of the process automatically.

The main problem we have right now is with django-celery, a third party package that doesn't support natural keys. We submitted a patch (https://github.com/celery/django-celery/pull/552), but its last release was in November 2017. If there is no new release by the time we plan to make ours, we are going to have to come up with another solution (a monkeypatch, releasing our fork as a PyPI package, or something else; also sketched below).

We are giving the database conversion issue top priority and have an almost complete solution in place for the next release. We are still dependent on a new release of django-celery and on users that have allowed us to use their databases so we can test against real data. Our patches have a high success rate and this will continue improving in subsequent releases.

On Saturday, July 28, 2018 at 6:16:14 PM UTC-4, Douglas Van Es wrote:
>
> First, great work merging NG and Mayan, guys! I really appreciate the huge
> amount of work you have done, and it is amazing that Roberto has been
> able to continue development in light of the destruction the hurricane
> brought to his country. And the development done by the NG team and those
> that have joined Roberto on a more permanent basis is fantastic as well!
>
> Now, how do I dump and import my SQLite db?
>
> I see the earlier posts on the matter, and it seems I will (or may) need
> to serialize the data before I can import. But is this process incomplete?
> With a SQLite db of only some 16,000 documents, will I really have
> duplicate #'s?
>
> I am unsure if the examples others have posted regarding upgrading are
> appropriate to my install, as it seems nobody who has posted is running
> Mayan in the same way I do.
>
> I simply run Mayan in a Docker container, per the documentation! lol.
>
> I have the SQLite db data in a persistent Docker volume. I created the
> volume like so:
>
> # docker volume create --name mayan_media --opt type=none --opt
> device=/[PATH_TO_PERSISTENT_DATA] --opt o=bind
>
> I initialize the container:
>
> # docker run --rm -v mayan_media:/var/lib/mayan -v mayan_settings:/etc/
> mayan mayanedms/mayanedms mayan:init
>
> Then when I start Mayan I do it like this:
>
> # docker run -d --name mayan-edms --restart=always -p 80:80 -v
> mayan_data:/var/lib/mayan -v /[PATH]/invwatch:/[PATH]/invwatch -v /[PATH]/
> invstaging:/[PATH]/invstaging mayanedms/mayanedms:2.7.3
>
> What will I have to do to migrate my SQLite db to PostgreSQL? And is that
> the simplest/recommended way, or should I use MySQL?
>
> Thanks in advance!
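For anyone curious what the natural key mechanism looks like in practice, here is a minimal sketch using Django's standard serialization hooks. The model and field names are made up for illustration and are not Mayan's actual code:

    # Illustrative model only; the manager/method pair is what Django's
    # serializer looks for when natural keys are enabled.
    from django.db import models


    class DocumentTypeManager(models.Manager):
        def get_by_natural_key(self, label):
            # Find the row by its unique label instead of its auto-increment id.
            return self.get(label=label)


    class DocumentType(models.Model):
        label = models.CharField(max_length=96, unique=True)

        objects = DocumentTypeManager()

        def natural_key(self):
            # Dump this row as its label, not its numeric primary key, so it
            # can be matched up again in a database where the ids differ.
            return (self.label,)

With that in place, a dump and restore between databases has roughly this shape (plain Django; the exact steps for a Mayan Docker install may differ):

    python manage.py dumpdata --natural-primary --natural-foreign --indent 4 > data.json
    # Point settings.DATABASES at the new PostgreSQL database and create the schema:
    python manage.py migrate
    # Then load the serialized data into the new database:
    python manage.py loaddata data.json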
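And since the monkeypatch option was mentioned: this is roughly the shape it could take. It is only a sketch, assuming django-celery's PeriodicTask identifies rows by a unique name field; nothing like this has shipped:

    # Sketch of a possible monkeypatch, not shipped code. Assumes djcelery's
    # PeriodicTask has a unique "name" field to key on.
    from djcelery.models import PeriodicTask


    def natural_key(self):
        # Serialize a periodic task by its unique name instead of its numeric pk.
        return (self.name,)


    def get_by_natural_key(self, name):
        # Let loaddata locate the matching row in the target database by name.
        return self.get(name=name)


    PeriodicTask.natural_key = natural_key
    type(PeriodicTask.objects).get_by_natural_key = get_by_natural_key

Something like this would have to run early (an AppConfig.ready() is one place) so that dumpdata and loaddata see it, and it only helps models that actually have a unique field to key on, which is part of why a proper upstream release is the preferred outcome.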