Well, I think my original question strayed from what I actually meant,
because of my poor understanding and phrasing :(

Actually, I have 1TB of data and hardware specs sufficient to handle that
volume, but the problem is that the analysis requires many join operations,
and the process is running too slowly right now.
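
To illustrate the kind of workload I mean, here is a rough sketch (the
table, columns, and ids below are made up, not my real schema): a
multi-hop friends-of-friends query that would otherwise chain several
self-joins can be written as a single recursive CTE:

-- Hypothetical edge table (names are made up):
--   CREATE TABLE follows (src bigint NOT NULL, dst bigint NOT NULL,
--                         PRIMARY KEY (src, dst));
--   CREATE INDEX ON follows (dst);

-- Everyone reachable within 3 hops of user 42, walking the graph
-- with one recursive CTE instead of chained self-joins:
WITH RECURSIVE reachable AS (
    SELECT dst AS person, 1 AS depth
    FROM follows
    WHERE src = 42                 -- placeholder starting user
  UNION                            -- UNION (not UNION ALL) drops duplicate rows
    SELECT f.dst, r.depth + 1
    FROM reachable r
    JOIN follows f ON f.src = r.person
    WHERE r.depth < 3              -- hop limit bounds the recursion
)
SELECT person, min(depth) AS hops
FROM reachable
GROUP BY person;

Even in this form, multi-hop traversals like this are the part that is
slow for us.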

I've searched around and found that a graph model fits network data, such
as social data, nicely in terms of query performance.

Should I change my DB (I mean the DB I use for analysis)? Or do I need
some other solution or an extension?

Thanks

On Thu, Jun 14, 2018 at 3:36 PM, Melvin Davidson <melvin6...@gmail.com>
wrote:

>
>
> On Thu, Jun 14, 2018 at 6:30 PM, Adrian Klaver <adrian.kla...@aklaver.com>
> wrote:
>
>> On 06/14/2018 02:33 PM, Data Ace wrote:
>>
>>> Hi, I'm new to the community.
>>>
>>> Recently, I've been involved in a project that develops a social network
>>> data analysis service (and my client's DBMS is based on PostgreSQL).
>>> I need to gather a huge volume of unstructured raw data for this project,
>>> and the problem is that with PostgreSQL it would be very difficult to handle
>>> this kind of data. Are there any PG extension modules or methods that are
>>> recommended for my project?
>>>
>>
>> In addition to Ravi's questions:
>>
>> What does the data look like?
>>
>> What Postgres version?
>>
>> How is the data going to get from A <--> B: locally, remotely, or both?
>>
>> Is there another database or program involved in the process?
>>
>>
>>> Thanks in advance.
>>>
>>
>>
>> --
>> Adrian Klaver
>> adrian.kla...@aklaver.com
>>
>>
> In addition to Ravi's and Adrian's questions:
>
> What is the hardware configuration?
>
> --
> Melvin Davidson
> Maj. Database & Exploration Specialist
> Universe Exploration Command – UXC
> Employment by invitation only!
>
