Hi,
Here is a patch with an initial implementation of on-demand loading of the
result set for the query tool and data grid.
--
*Harshal Dhumal*
*Sr. Software Engineer*
EnterpriseDB India: http://www.enterprisedb.com
The Enterprise PostgreSQL Company
On Tue, Apr 25, 2017 at 5:21 PM, Dave Page <[email protected]> wrote:
> Hi
>
> On Tue, Apr 25, 2017 at 8:41 AM, Harshal Dhumal <
> [email protected]> wrote:
>
>> Hi Dave,
>>
>> To implement feature #2137 <https://redmine.postgresql.org/issues/2137> we'll
>> need to use a server-side cursor. However, server cursors have some
>> limitations. For example:
>> 1. A server cursor cannot execute a BEGIN; query (basically, the user
>> cannot start a new database transaction).
>> 2. If Auto commit is true, we try to execute user queries inside BEGIN
>> and END whenever possible, even though the user has not put BEGIN and
>> END in the query.
>>
>> Also, not all queries executed using the Query tool produce records as a
>> result. So can we assume that only queries starting with SELECT should be
>> executed using a server cursor to support on-demand loading?
>> Or should we give the user an option to enable on-demand loading, like
>> the options we have for Auto commit and Auto rollback?
>> With an on-demand loading option, the user would be responsible for
>> executing only queries which can run using a server cursor.
>>
>> Let me know your opinion on this.
>>
>
> Hmm, those are good points.
>
> So, as a first step, there's no absolute requirement to use a server side
> cursor here. The results can be materialised in libpq/psycopg2 (perhaps
> using an async query), then transferred to the client in batches as
> described in the ticket.
>
> I think this would be a significant improvement - we can revisit the
> possibility of using server side cursors in the future when we have more
> ability to parse the query string before executing it (something we will
> want to do when we merge query tool/edit grid functionality).
>
> --
> Dave Page
> Blog: http://pgsnake.blogspot.com
> Twitter: @pgsnake
>
> EnterpriseDB UK: http://www.enterprisedb.com
> The Enterprise PostgreSQL Company
>
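Before the patch itself, here is a minimal sketch of the batching approach it implements: fetch at most ON_DEMAND_RECORD_COUNT rows per round trip, and report has_more_rows when a full batch came back. The names ON_DEMAND_RECORD_COUNT, has_more_rows, and rows_fetched_from/to mirror the patch; MaterializedCursor is a hypothetical in-memory stand-in for the real psycopg2 async cursor, not part of pgAdmin.

```python
ON_DEMAND_RECORD_COUNT = 1000  # mirrors the new config.py setting


class MaterializedCursor:
    """Hypothetical stand-in for a cursor holding a materialized result set."""
    def __init__(self, rows):
        self._rows = rows
        self._pos = 0

    def fetchmany(self, size):
        # Return up to `size` rows, advancing the cursor position.
        batch = self._rows[self._pos:self._pos + size]
        self._pos += len(batch)
        return batch


def fetch_batch(cursor, fetched_so_far, batch_size=ON_DEMAND_RECORD_COUNT):
    """One /fetch round trip: (rows, has_more_rows, rows_from, rows_to)."""
    rows = cursor.fetchmany(batch_size)
    # A full batch means there may be more rows; a short batch means the
    # result set is exhausted (the same heuristic the patch uses).
    has_more_rows = len(rows) == batch_size
    rows_fetched_from = fetched_so_far + 1 if rows else 0
    rows_fetched_to = fetched_so_far + len(rows)
    return rows, has_more_rows, rows_fetched_from, rows_fetched_to
```

For example, with 2500 materialized rows, successive calls report rows 1-1000 and 1001-2000 with has_more_rows set, then 2001-2500 with it cleared, which is how the client knows when to stop requesting further batches.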
diff --git a/web/config.py b/web/config.py
index 5ff9748..f89dc89 100644
--- a/web/config.py
+++ b/web/config.py
@@ -324,6 +324,12 @@ THREADED_MODE = True
SQLALCHEMY_TRACK_MODIFICATIONS = False
##########################################################################
+# Number of records to fetch in one batch in query tool when query result
+# set is large.
+##########################################################################
+ON_DEMAND_RECORD_COUNT = 1000
+
+##########################################################################
# Local config settings
##########################################################################
diff --git a/web/pgadmin/static/js/selection/copy_data.js b/web/pgadmin/static/js/selection/copy_data.js
index 018efea..c549563 100644
--- a/web/pgadmin/static/js/selection/copy_data.js
+++ b/web/pgadmin/static/js/selection/copy_data.js
@@ -11,13 +11,13 @@ define([
var grid = self.slickgrid;
var columnDefinitions = grid.getColumns();
var selectedRanges = grid.getSelectionModel().getSelectedRanges();
- var data = grid.getData();
+ var dataView = grid.getData();
var rows = grid.getSelectedRows();
if (allTheRangesAreFullRows(selectedRanges, columnDefinitions)) {
self.copied_rows = rows.map(function (rowIndex) {
- return data[rowIndex];
+ return grid.getDataItem(rowIndex);
});
setPasteRowButtonEnablement(self.can_edit, true);
} else {
@@ -25,7 +25,7 @@ define([
setPasteRowButtonEnablement(self.can_edit, false);
}
- var csvText = rangeBoundaryNavigator.rangesToCsv(data, columnDefinitions, selectedRanges);
+ var csvText = rangeBoundaryNavigator.rangesToCsv(dataView.getItems(), columnDefinitions, selectedRanges);
if (csvText) {
clipboard.copyTextToClipboard(csvText);
}
diff --git a/web/pgadmin/static/js/selection/row_selector.js b/web/pgadmin/static/js/selection/row_selector.js
index 76a8c1a..b166edc 100644
--- a/web/pgadmin/static/js/selection/row_selector.js
+++ b/web/pgadmin/static/js/selection/row_selector.js
@@ -15,9 +15,11 @@ define(['jquery', 'sources/selection/range_selection_helper', 'slickgrid'], func
if (grid.getColumns()[args.cell].id === 'row-header-column') {
if (event.target.type != "checkbox") {
var checkbox = $(event.target).find('input[type="checkbox"]');
- toggleCheckbox($(checkbox));
+ if (checkbox.length > 0) {
+ toggleCheckbox($(checkbox));
+ updateRanges(grid, args.row);
+ }
}
- updateRanges(grid, args.row);
}
}
diff --git a/web/pgadmin/static/js/slickgrid/slick.pgadmin.editors.js b/web/pgadmin/static/js/slickgrid/slick.pgadmin.editors.js
index cdfba4d..dd10fc1 100644
--- a/web/pgadmin/static/js/slickgrid/slick.pgadmin.editors.js
+++ b/web/pgadmin/static/js/slickgrid/slick.pgadmin.editors.js
@@ -110,11 +110,11 @@
// When text editor opens
this.loadValue = function (item) {
- if (item[args.column.pos] === "") {
+ if (item[args.column.field] === "") {
$input.val("''");
}
else {
- $input.val(defaultValue = item[args.column.pos]);
+ $input.val(defaultValue = item[args.column.field]);
$input.select();
}
};
@@ -141,7 +141,7 @@
};
this.applyValue = function (item, state) {
- item[args.column.pos] = state;
+ item[args.column.field] = state;
};
this.isValueChanged = function () {
@@ -252,7 +252,7 @@
};
this.loadValue = function (item) {
- var data = defaultValue = item[args.column.pos];
+ var data = defaultValue = item[args.column.field];
if (typeof data === "object" && !Array.isArray(data)) {
data = JSON.stringify(data);
} else if (Array.isArray(data)) {
@@ -278,7 +278,7 @@
};
this.applyValue = function (item, state) {
- item[args.column.pos] = state;
+ item[args.column.field] = state;
};
this.isValueChanged = function () {
@@ -385,7 +385,7 @@
};
this.loadValue = function (item) {
- $input.val(defaultValue = item[args.column.pos]);
+ $input.val(defaultValue = item[args.column.field]);
$input.select();
};
@@ -394,7 +394,7 @@
};
this.applyValue = function (item, state) {
- item[args.column.pos] = state;
+ item[args.column.field] = state;
};
this.isValueChanged = function () {
@@ -468,12 +468,12 @@
};
this.loadValue = function (item) {
- defaultValue = item[args.column.pos];
+ defaultValue = item[args.column.field];
if (_.isNull(defaultValue)||_.isUndefined(defaultValue)) {
$select.prop('indeterminate', true);
}
else {
- defaultValue = !!item[args.column.pos];
+ defaultValue = !!item[args.column.field];
if (defaultValue) {
$select.prop('checked', true);
} else {
@@ -490,7 +490,7 @@
};
this.applyValue = function (item, state) {
- item[args.column.pos] = state;
+ item[args.column.field] = state;
};
this.isValueChanged = function () {
@@ -590,7 +590,7 @@
};
this.loadValue = function (item) {
- var data = defaultValue = item[args.column.pos];
+ var data = defaultValue = item[args.column.field];
if (typeof data === "object" && !Array.isArray(data)) {
data = JSON.stringify(data);
} else if (Array.isArray(data)) {
@@ -613,7 +613,7 @@
};
this.applyValue = function (item, state) {
- item[args.column.pos] = state;
+ item[args.column.field] = state;
};
this.isValueChanged = function () {
@@ -667,7 +667,7 @@
};
this.loadValue = function (item) {
- var value = item[args.column.pos];
+ var value = item[args.column.field];
// Check if value is null or undefined
if (value === undefined && typeof value === "undefined") {
@@ -819,7 +819,7 @@
};
this.loadValue = function (item) {
- defaultValue = item[args.column.pos];
+ defaultValue = item[args.column.field];
$input.val(defaultValue);
$input[0].defaultValue = defaultValue;
$input.select();
@@ -833,7 +833,7 @@
};
this.applyValue = function (item, state) {
- item[args.column.pos] = state;
+ item[args.column.field] = state;
};
this.isValueChanged = function () {
diff --git a/web/pgadmin/templates/base.html b/web/pgadmin/templates/base.html
index 375b39d..8be0205 100755
--- a/web/pgadmin/templates/base.html
+++ b/web/pgadmin/templates/base.html
@@ -144,6 +144,12 @@
],
"exports": 'Slick.Grid'
},
+ "slickgrid/slick.dataview": {
+ "deps": [
+ "slickgrid"
+ ],
+ "exports": 'Slick.Data.DataView'
+ },
"flotr2": {
deps: ['bean'],
exports: function(bean) {
diff --git a/web/pgadmin/tools/sqleditor/__init__.py b/web/pgadmin/tools/sqleditor/__init__.py
index d114988..fb484b9 100644
--- a/web/pgadmin/tools/sqleditor/__init__.py
+++ b/web/pgadmin/tools/sqleditor/__init__.py
@@ -26,7 +26,7 @@ from pgadmin.utils.sqlautocomplete.autocomplete import SQLAutoComplete
from pgadmin.misc.file_manager import Filemanager
-from config import PG_DEFAULT_DRIVER
+from config import PG_DEFAULT_DRIVER, ON_DEMAND_RECORD_COUNT
MODULE_NAME = 'sqleditor'
@@ -228,13 +228,32 @@ def start_view_data(trans_id):
# Check the transaction and connection status
status, error_msg, conn, trans_obj, session_obj = check_transaction_status(trans_id)
+
+ # Get the default connection. The current connection attached to the
+ # transaction id holds the cursor with the query result, so we cannot use
+ # it to execute another query or we would lose that result.
+
+ manager = get_driver(PG_DEFAULT_DRIVER).connection_manager(trans_obj.sid)
+ default_conn = manager.connection(did=trans_obj.did)
+
+ # Connect to the Server if not connected.
+ if not default_conn.connected():
+ status, msg = default_conn.connect()
+ if not status:
+ return make_json_response(
+ data={'status': status, 'result': u"{}".format(msg)}
+ )
+
if status and conn is not None \
and trans_obj is not None and session_obj is not None:
try:
+ # set fetched row count to 0 as we are executing query again.
+ trans_obj.update_fetched_row_cnt(0)
+ session_obj['command_obj'] = pickle.dumps(trans_obj, -1)
# Fetch the sql and primary_keys from the object
sql = trans_obj.get_sql()
- pk_names, primary_keys = trans_obj.get_primary_keys()
+ pk_names, primary_keys = trans_obj.get_primary_keys(default_conn)
# Fetch the applied filter.
filter_applied = trans_obj.is_filter_applied()
@@ -302,6 +321,8 @@ def start_query_tool(trans_id):
# Use pickle.loads function to get the command object
session_obj = grid_data[str(trans_id)]
trans_obj = pickle.loads(session_obj['command_obj'])
+ # set fetched row count to 0 as we are executing query again.
+ trans_obj.update_fetched_row_cnt(0)
can_edit = False
can_filter = False
@@ -428,43 +449,6 @@ def preferences(trans_id):
return success_return()
[email protected]('/columns/<int:trans_id>', methods=["GET"])
-@login_required
-def get_columns(trans_id):
- """
- This method will returns list of columns of last async query.
-
- Args:
- trans_id: unique transaction id
- """
- columns = dict()
- columns_info = None
- primary_keys = None
- status, error_msg, conn, trans_obj, session_obj = check_transaction_status(trans_id)
- if status and conn is not None and session_obj is not None:
- # Check PK column info is available or not
- if 'primary_keys' in session_obj:
- primary_keys = session_obj['primary_keys']
-
- # Fetch column information
- columns_info = conn.get_column_info()
- if columns_info is not None:
- for col in columns_info:
- col_type = dict()
- col_type['type_code'] = col['type_code']
- col_type['type_name'] = None
- columns[col['name']] = col_type
-
- # As we changed the transaction object we need to
- # restore it and update the session variable.
- session_obj['columns_info'] = columns
- update_session_grid_transaction(trans_id, session_obj)
-
- return make_json_response(data={'status': True,
- 'columns': columns_info,
- 'primary_keys': primary_keys})
-
-
@blueprint.route('/poll/<int:trans_id>', methods=["GET"])
@login_required
def poll(trans_id):
@@ -476,12 +460,19 @@ def poll(trans_id):
"""
result = None
rows_affected = 0
+ rows_fetched_from = 0
+ rows_fetched_to = 0
+ has_more_rows = False
additional_result = []
+ columns = dict()
+ columns_info = None
+ primary_keys = None
+ types = {}
# Check the transaction and connection status
status, error_msg, conn, trans_obj, session_obj = check_transaction_status(trans_id)
if status and conn is not None and session_obj is not None:
- status, result = conn.poll(formatted_exception_msg=True)
+ status, result = conn.poll(formatted_exception_msg=True, no_result=True)
if not status:
return internal_server_error(result)
elif status == ASYNC_OK:
@@ -496,6 +487,45 @@ def poll(trans_id):
if (trans_status == TX_STATUS_INERROR and
trans_obj.auto_rollback):
conn.execute_void("ROLLBACK;")
+
+ st, result = conn.async_fetchmany_2darray(ON_DEMAND_RECORD_COUNT)
+ if st:
+ if 'primary_keys' in session_obj:
+ primary_keys = session_obj['primary_keys']
+
+ # Fetch column information
+ columns_info = conn.get_column_info()
+ if columns_info is not None:
+ for col in columns_info:
+ col_type = dict()
+ col_type['type_code'] = col['type_code']
+ col_type['type_name'] = None
+ columns[col['name']] = col_type
+
+ if columns:
+ session_obj['columns_info'] = columns
+ st, types = fetch_pg_types(columns, trans_obj)
+
+ if not st:
+ return internal_server_error(types)
+ # If the status of async_fetchmany_2darray is True and the result is
+ # None, there is nothing to fetch.
+ if result and rows_affected > -1:
+ res_len = len(result)
+ if res_len == ON_DEMAND_RECORD_COUNT:
+ has_more_rows = True
+
+ if res_len > 0:
+ rows_fetched_from = trans_obj.get_fetched_row_cnt()
+ trans_obj.update_fetched_row_cnt(rows_fetched_from + res_len)
+ rows_fetched_from += 1
+ rows_fetched_to = trans_obj.get_fetched_row_cnt()
+ session_obj['command_obj'] = pickle.dumps(trans_obj, -1)
+
+ # As we changed the transaction object we need to
+ # restore it and update the session variable.
+ update_session_grid_transaction(trans_id, session_obj)
+
elif status == ASYNC_EXECUTION_ABORTED:
status = 'Cancel'
else:
@@ -536,51 +566,94 @@ def poll(trans_id):
data={
'status': status, 'result': result,
'rows_affected': rows_affected,
- 'additional_messages': additional_messages
+ 'rows_fetched_from': rows_fetched_from,
+ 'rows_fetched_to': rows_fetched_to,
+ 'additional_messages': additional_messages,
+ 'has_more_rows': has_more_rows,
+ 'colinfo': columns_info,
+ 'primary_keys': primary_keys,
+ 'types': types
}
)
[email protected]('/fetch/types/<int:trans_id>', methods=["GET"])
[email protected]('/fetch/<int:trans_id>', methods=["GET"])
@login_required
-def fetch_pg_types(trans_id):
+def fetch(trans_id):
+ result = None
+ has_more_rows = False
+ rows_fetched_from = 0
+ rows_fetched_to = 0
+
+ # Check the transaction and connection status
+ status, error_msg, conn, trans_obj, session_obj = check_transaction_status(trans_id)
+ if status and conn is not None and session_obj is not None:
+ status, result = conn.async_fetchmany_2darray(ON_DEMAND_RECORD_COUNT)
+ if not status:
+ status = 'Error'
+ else:
+ status = 'Success'
+ res_len = len(result)
+ if res_len == ON_DEMAND_RECORD_COUNT:
+ has_more_rows = True
+
+ if res_len:
+ rows_fetched_from = trans_obj.get_fetched_row_cnt()
+ trans_obj.update_fetched_row_cnt(rows_fetched_from + res_len)
+ rows_fetched_from += 1
+ rows_fetched_to = trans_obj.get_fetched_row_cnt()
+ session_obj['command_obj'] = pickle.dumps(trans_obj, -1)
+ update_session_grid_transaction(trans_id, session_obj)
+ else:
+ status = 'NotConnected'
+ result = error_msg
+
+ return make_json_response(
+ data={
+ 'status': status, 'result': result,
+ 'has_more_rows': has_more_rows,
+ 'rows_fetched_from': rows_fetched_from,
+ 'rows_fetched_to': rows_fetched_to
+ }
+ )
+
+
+def fetch_pg_types(columns_info, trans_obj):
"""
This method is used to fetch the pg types, which is required
to map the data type comes as a result of the query.
Args:
- trans_id: unique transaction id
+ columns_info: dict of column information (including type_code) per column
"""
- # Check the transaction and connection status
- status, error_msg, conn, trans_obj, session_obj = check_transaction_status(trans_id)
- if status and conn is not None \
- and trans_obj is not None and session_obj is not None:
- res = {}
- if 'columns_info' in session_obj \
- and session_obj['columns_info'] is not None:
+ # Get the default connection. The current connection attached to the
+ # transaction id holds the cursor with the query result, so we cannot use
+ # it to execute another query or we would lose that result.
- oids = [session_obj['columns_info'][col]['type_code'] for col in session_obj['columns_info']]
+ manager = get_driver(PG_DEFAULT_DRIVER).connection_manager(trans_obj.sid)
+ default_conn = manager.connection(did=trans_obj.did)
- if oids:
- status, res = conn.execute_dict(
- u"""SELECT oid, format_type(oid,null) as typname FROM pg_type WHERE oid IN %s ORDER BY oid;
+ # Connect to the Server if not connected.
+ res = []
+ if not default_conn.connected():
+ status, msg = default_conn.connect()
+ if not status:
+ return status, msg
+
+ oids = [columns_info[col]['type_code'] for col in columns_info]
+
+ if oids:
+ status, res = default_conn.execute_dict(
+ u"""SELECT oid, format_type(oid,null) as typname FROM pg_type WHERE oid IN %s ORDER BY oid;
""", [tuple(oids)])
- if status:
- # iterate through pg_types and update the type name in session object
- for record in res['rows']:
- for col in session_obj['columns_info']:
- type_obj = session_obj['columns_info'][col]
- if type_obj['type_code'] == record['oid']:
- type_obj['type_name'] = record['typname']
+ if not status:
+ return False, res
- update_session_grid_transaction(trans_id, session_obj)
+ return status, res['rows']
else:
- status = False
- res = error_msg
-
- return make_json_response(data={'status': status, 'result': res})
+ return True, []
@blueprint.route('/save/<int:trans_id>', methods=["PUT", "POST"])
diff --git a/web/pgadmin/tools/sqleditor/command.py b/web/pgadmin/tools/sqleditor/command.py
index 1795155..89621c8 100644
--- a/web/pgadmin/tools/sqleditor/command.py
+++ b/web/pgadmin/tools/sqleditor/command.py
@@ -255,7 +255,21 @@ class SQLFilter(object):
return status, result
-class GridCommand(BaseCommand, SQLFilter):
+class FetchedRowTracker(object):
+ """
+ Keeps track of fetched row count.
+ """
+ def __init__(self, **kwargs):
+ self.fetched_rows = 0
+
+ def get_fetched_row_cnt(self):
+ return self.fetched_rows
+
+ def update_fetched_row_cnt(self, rows_cnt):
+ self.fetched_rows = rows_cnt
+
+
+class GridCommand(BaseCommand, SQLFilter, FetchedRowTracker):
"""
class GridCommand(object)
@@ -287,6 +301,7 @@ class GridCommand(BaseCommand, SQLFilter):
"""
BaseCommand.__init__(self, **kwargs)
SQLFilter.__init__(self, **kwargs)
+ FetchedRowTracker.__init__(self, **kwargs)
# Save the connection id, command type
self.conn_id = kwargs['conn_id'] if 'conn_id' in kwargs else None
@@ -296,10 +311,10 @@ class GridCommand(BaseCommand, SQLFilter):
if self.cmd_type == VIEW_FIRST_100_ROWS or self.cmd_type == VIEW_LAST_100_ROWS:
self.limit = 100
- def get_primary_keys(self):
+ def get_primary_keys(self, *args, **kwargs):
return None, None
- def save(self, changed_data):
+ def save(self, changed_data, default_conn=None):
return forbidden(errmsg=gettext("Data cannot be saved for the current object."))
def get_limit(self):
@@ -337,14 +352,14 @@ class TableCommand(GridCommand):
# call base class init to fetch the table name
super(TableCommand, self).__init__(**kwargs)
- def get_sql(self):
+ def get_sql(self, default_conn=None):
"""
This method is used to create a proper SQL query
to fetch the data for the specified table
"""
# Fetch the primary keys for the table
- pk_names, primary_keys = self.get_primary_keys()
+ pk_names, primary_keys = self.get_primary_keys(default_conn)
sql_filter = self.get_filter()
@@ -359,13 +374,16 @@ class TableCommand(GridCommand):
return sql
- def get_primary_keys(self):
+ def get_primary_keys(self, default_conn=None):
"""
This function is used to fetch the primary key columns.
"""
driver = get_driver(PG_DEFAULT_DRIVER)
- manager = driver.connection_manager(self.sid)
- conn = manager.connection(did=self.did, conn_id=self.conn_id)
+ if default_conn is None:
+ manager = driver.connection_manager(self.sid)
+ conn = manager.connection(did=self.did, conn_id=self.conn_id)
+ else:
+ conn = default_conn
pk_names = ''
primary_keys = dict()
@@ -397,7 +415,7 @@ class TableCommand(GridCommand):
def can_filter(self):
return True
- def save(self, changed_data):
+ def save(self, changed_data, default_conn=None):
"""
This function is used to save the data into the database.
Depending on condition it will either update or insert the
@@ -406,9 +424,12 @@ class TableCommand(GridCommand):
Args:
changed_data: Contains data to be saved
"""
-
- manager = get_driver(PG_DEFAULT_DRIVER).connection_manager(self.sid)
- conn = manager.connection(did=self.did, conn_id=self.conn_id)
+ driver = get_driver(PG_DEFAULT_DRIVER)
+ if default_conn is None:
+ manager = driver.connection_manager(self.sid)
+ conn = manager.connection(did=self.did, conn_id=self.conn_id)
+ else:
+ conn = default_conn
status = False
res = None
@@ -418,14 +439,6 @@ class TableCommand(GridCommand):
list_of_sql = []
_rowid = None
- # Replace column positions with names
- def set_column_names(data):
- new_data = {}
- for key in data:
- new_data[changed_data['columns'][int(key)]['name']] = data[key]
-
- return new_data
-
if conn.connected():
# Start the transaction
@@ -446,8 +459,7 @@ class TableCommand(GridCommand):
data = changed_data[of_type][each_row]['data']
# Remove our unique tracking key
data.pop('__temp_PK', None)
- data = set_column_names(data)
- data_type = set_column_names(changed_data[of_type][each_row]['data_type'])
+ data_type = changed_data[of_type][each_row]['data_type']
list_of_rowid.append(data.get('__temp_PK'))
sql = render_template("/".join([self.sql_path, 'insert.sql']),
@@ -461,9 +473,9 @@ class TableCommand(GridCommand):
# For updated rows
elif of_type == 'updated':
for each_row in changed_data[of_type]:
- data = set_column_names(changed_data[of_type][each_row]['data'])
- pk = set_column_names(changed_data[of_type][each_row]['primary_keys'])
- data_type = set_column_names(changed_data[of_type][each_row]['data_type'])
+ data = changed_data[of_type][each_row]['data']
+ pk = changed_data[of_type][each_row]['primary_keys']
+ data_type = changed_data[of_type][each_row]['data_type']
sql = render_template("/".join([self.sql_path, 'update.sql']),
data_to_be_saved=data,
primary_keys=pk,
@@ -483,18 +495,19 @@ class TableCommand(GridCommand):
rows_to_delete.append(changed_data[of_type][each_row])
# Fetch the keys for SQL generation
if is_first:
- # We need to covert dict_keys to normal list in Python3
- # In Python2, it's already a list & We will also fetch column names using index
- keys = [
- changed_data['columns'][int(k)]['name']
- for k in list(changed_data[of_type][each_row].keys())
- ]
+ # We need to convert dict_keys to a normal
+ # list in Python3.
+ # In Python2, it's already a list; we will also
+ # fetch column names using the index.
+ keys = list(changed_data[of_type][each_row].keys())
+
no_of_keys = len(keys)
is_first = False
# Map index with column name for each row
for row in rows_to_delete:
for k, v in row.items():
- # Set primary key with label & delete index based mapped key
+ # Set primary key with label & delete index based
+ # mapped key
try:
row[changed_data['columns'][int(k)]['name']] = v
except ValueError:
@@ -558,7 +571,7 @@ class ViewCommand(GridCommand):
# call base class init to fetch the table name
super(ViewCommand, self).__init__(**kwargs)
- def get_sql(self):
+ def get_sql(self, default_conn=None):
"""
This method is used to create a proper SQL query
to fetch the data for the specified view
@@ -613,7 +626,7 @@ class ForeignTableCommand(GridCommand):
# call base class init to fetch the table name
super(ForeignTableCommand, self).__init__(**kwargs)
- def get_sql(self):
+ def get_sql(self, default_conn=None):
"""
This method is used to create a proper SQL query
to fetch the data for the specified foreign table
@@ -658,7 +671,7 @@ class CatalogCommand(GridCommand):
# call base class init to fetch the table name
super(CatalogCommand, self).__init__(**kwargs)
- def get_sql(self):
+ def get_sql(self, default_conn=None):
"""
This method is used to create a proper SQL query
to fetch the data for the specified catalog object
@@ -683,7 +696,7 @@ class CatalogCommand(GridCommand):
return True
-class QueryToolCommand(BaseCommand):
+class QueryToolCommand(BaseCommand, FetchedRowTracker):
"""
class QueryToolCommand(BaseCommand)
@@ -693,13 +706,15 @@ class QueryToolCommand(BaseCommand):
def __init__(self, **kwargs):
# call base class init to fetch the table name
- super(QueryToolCommand, self).__init__(**kwargs)
+
+ BaseCommand.__init__(self, **kwargs)
+ FetchedRowTracker.__init__(self, **kwargs)
self.conn_id = None
self.auto_rollback = False
self.auto_commit = True
- def get_sql(self):
+ def get_sql(self, default_conn=None):
return None
def can_edit(self):
diff --git a/web/pgadmin/tools/sqleditor/templates/sqleditor/js/sqleditor.js b/web/pgadmin/tools/sqleditor/templates/sqleditor/js/sqleditor.js
index 2062aa2..10b3960 100644
--- a/web/pgadmin/tools/sqleditor/templates/sqleditor/js/sqleditor.js
+++ b/web/pgadmin/tools/sqleditor/templates/sqleditor/js/sqleditor.js
@@ -24,7 +24,8 @@ define(
'slickgrid/plugins/slick.cellselectionmodel',
'slickgrid/plugins/slick.cellcopymanager',
'slickgrid/plugins/slick.rowselectionmodel',
- 'slickgrid/slick.grid'
+ 'slickgrid/slick.grid',
+ 'slickgrid/slick.dataview'
],
function(
$, _, S, alertify, pgAdmin, Backbone, Backgrid, CodeMirror, pgExplain, GridSelector, clipboard, copyData
@@ -497,7 +498,7 @@ define(
- staged_rows:
This will hold all the data which user copies/pastes/deletes in grid
- deleted:
- This will hold all the data which user delets in grid
+ This will hold all the data which user deletes in grid
Events handling:
----------------
@@ -513,15 +514,6 @@ define(
- We are using this event for Copy operation on grid
*/
- // Get the item column value using a custom 'fieldIdx' column param
- get_item_column_value: function (item, column) {
- if (column.pos !== undefined) {
- return item[column.pos];
- } else {
- return null;
- }
- },
-
// This function is responsible to create and render the SlickGrid.
render_grid: function(collection, columns, is_editable) {
var self = this;
@@ -586,7 +578,7 @@ define(
});
var gridSelector = new GridSelector();
- grid_columns = gridSelector.getColumnDefinitionsWithCheckboxes(grid_columns);
+ grid_columns = self.grid_columns = gridSelector.getColumnDefinitionsWithCheckboxes(grid_columns);
var grid_options = {
editable: true,
@@ -594,8 +586,7 @@ define(
enableCellNavigation: true,
enableColumnReorder: false,
asyncEditorLoading: false,
- autoEdit: false,
- dataItemColumnValueExtractor: this.get_item_column_value
+ autoEdit: false
};
var $data_grid = self.$el.find('#datagrid');
@@ -603,14 +594,13 @@ define(
var grid_height = $($('#editor-panel').find('.wcFrame')[1]).height() - 35;
$data_grid.height(grid_height);
- // Add our own custom primary key to keep track of changes
- _.each(collection, function(row){
- row['__temp_PK'] = epicRandomString(15);
- });
+ var dataView = self.dataView = new Slick.Data.DataView(),
+ grid = self.grid = new Slick.Grid($data_grid, dataView, grid_columns, grid_options);
+
// Add-on function which allow us to identify the faulty row after insert/update
// and apply css accordingly
- collection.getItemMetadata = function(i) {
+ dataView.getItemMetadata = function(i) {
var res = {}, cssClass = '';
if (_.has(self.handler, 'data_store')) {
if (i in self.handler.data_store.added_index) {
@@ -626,9 +616,8 @@ define(
}
}
return {'cssClasses': cssClass};
- }
+ };
- var grid = new Slick.Grid($data_grid, collection, grid_columns, grid_options);
grid.registerPlugin( new Slick.AutoTooltips({ enableForHeaderCells: false }) );
grid.setSelectionModel(new Slick.RowSelectionModel({selectActiveRow: false}));
grid.registerPlugin(gridSelector);
@@ -646,57 +635,70 @@ define(
// Listener function to watch selected rows from grid
if (editor_data.selection) {
- editor_data.selection.onSelectedRangesChanged.subscribe(function(e, args) {
- var collection = this.grid.getData(),
- primary_key_list = _.keys(this.keys),
- _tmp_keys = [],
- _columns = this.columns,
- rows_for_stage = {}, selected_rows_list = [];
-
- // Only if entire row(s) are selected via check box
- if(_.has(this.selection, 'getSelectedRows')) {
- selected_rows_list = this.selection.getSelectedRows();
- // We will map selected row primary key name with position
- // For each Primary key
- _.each(primary_key_list, function(p) {
- // For each columns search primary key position
- _.each(_columns, function(c) {
- if(c.name == p) {
- _tmp_keys.push(c.pos);
- }
- });
- });
- // Now assign mapped temp PK to PK
- primary_key_list = _tmp_keys;
- }
-
- // If any row(s) selected ?
- if(selected_rows_list.length) {
- if(this.editor.handler.can_edit)
- // Enable delete rows button
- $("#btn-delete-row").prop('disabled', false);
-
- // Enable copy rows button
- $("#btn-copy-row").prop('disabled', false);
- // Collect primary key data from collection as needed for stage row
- _.each(selected_rows_list, function(row_index) {
- var row_data = collection[row_index];
- // Store Primary key data for selected rows
- rows_for_stage[row_data.__temp_PK] = _.pick(row_data, primary_key_list);
- });
- } else {
- // Clear the object as no rows to delete
- rows_for_stage = {};
- // Disable delete/copy rows button
- $("#btn-delete-row").prop('disabled', true);
- $("#btn-copy-row").prop('disabled', true);
- }
+ editor_data.selection.onSelectedRangesChanged.subscribe(function(e, args) {
+ var dataView = this.grid.getData(),
+ collection = dataView.getItems(),
+ primary_key_list = _.keys(this.keys),
+ _tmp_keys = [],
+ _columns = this.columns,
+ rows_for_stage = {},
+ selected_rows_list = [];
+
+ // Only if entire row(s) are selected via check box
+ if(_.has(this.selection, 'getSelectedRows')) {
+ selected_rows_list = this.selection.getSelectedRows();
+ // We will map selected row primary key name with position
+ // For each Primary key
+ _.each(primary_key_list, function(p) {
+ // For each columns search primary key position
+ _.each(_columns, function(c) {
+ if(c.name == p) {
+ _tmp_keys.push(c.name);
+ }
+ });
+ });
+ // Now assign mapped temp PK to PK
+ primary_key_list = _tmp_keys;
+ }
- // Update main data store
- this.editor.handler.data_store.staged_rows = rows_for_stage;
- }.bind(editor_data));
+ // If any row(s) selected ?
+ if(selected_rows_list.length) {
+ if(this.editor.handler.can_edit)
+ // Enable delete rows button
+ $("#btn-delete-row").prop('disabled', false);
+
+ // Enable copy rows button
+ $("#btn-copy-row").prop('disabled', false);
+ // Collect primary key data from collection as needed for stage row
+ _.each(selected_rows_list, function(row_index) {
+ var row_data = collection[row_index];
+ // Store Primary key data for selected rows
+ rows_for_stage[row_data.__temp_PK] = _.pick(row_data, primary_key_list);
+ });
+ } else {
+ // Clear the object as no rows to delete
+ rows_for_stage = {};
+ // Disable delete/copy rows button
+ $("#btn-delete-row").prop('disabled', true);
+ $("#btn-copy-row").prop('disabled', true);
+ }
+
+ // Update main data store
+ this.editor.handler.data_store.staged_rows = rows_for_stage;
+ }.bind(editor_data));
}
+ // listen for row count change.
+ dataView.onRowCountChanged.subscribe(function (e, args) {
+ grid.updateRowCount();
+ grid.render();
+ });
+
+ // listen for row change.
+ dataView.onRowsChanged.subscribe(function (e, args) {
+ grid.invalidateRows(args.rows);
+ grid.render();
+ });
// Listener function which will be called before user updates existing cell
// This will be used to collect primary key for that row
@@ -715,8 +717,7 @@ define(
// Fetch primary keys for the row before they gets modified
var _columns = self.handler.columns;
_.each(_keys, function(value, key) {
- pos = _.where(_columns, {name: key})[0]['pos']
- current_pk[pos] = before_data[pos];
+ current_pk[key] = before_data[key];
});
// Place it in main variable for later use
self.handler.primary_keys_data[_pk] = current_pk
@@ -748,7 +749,7 @@ define(
// Fetch current row data from grid
column_values = grid.getDataItem(row, cell)
// Get the value from cell
- value = column_values[column_info.pos] || '';
+ value = column_values[column_info.field] || '';
// Copy this value to Clipboard
if(value)
clipboard.copyTextToClipboard(value);
@@ -762,7 +763,7 @@ define(
// Listener function which will be called when user updates existing rows
grid.onCellChange.subscribe(function (e, args) {
// self.handler.data_store.updated will hold all the updated data
- var changed_column = args.grid.getColumns()[args.cell].pos, // Current field pos
+ var changed_column = args.grid.getColumns()[args.cell].field, // Current field pos
updated_data = args.item[changed_column], // New value for current field
_pk = args.item.__temp_PK || null, // Unique key to identify row
column_data = {},
@@ -778,7 +779,7 @@ define(
column_data);
//Find type for current column
self.handler.data_store.added[_pk]['err'] = false
- self.handler.data_store.added[_pk]['data_type'][changed_column] = _.where(this.columns, {pos: changed_column})[0]['type'];
+ self.handler.data_store.added[_pk]['data_type'][changed_column] = _.where(this.columns, {name: changed_column})[0]['type'];
// Check if it is updated data from existing rows?
} else if(_pk in self.handler.data_store.updated) {
_.extend(
@@ -788,7 +789,7 @@ define(
self.handler.data_store.updated[_pk]['err'] = false
//Find type for current column
- self.handler.data_store.updated[_pk]['data_type'][changed_column] = _.where(this.columns, {pos: changed_column})[0]['type'];
+ self.handler.data_store.updated[_pk]['data_type'][changed_column] = _.where(this.columns, {name: changed_column})[0]['type'];
} else {
// First updated data for this primary key
self.handler.data_store.updated[_pk] = {
@@ -798,7 +799,7 @@ define(
self.handler.data_store.updated_index[args.row] = _pk;
// Find & add column data type for current changed column
var temp = {};
- temp[changed_column] = _.where(this.columns, {pos: changed_column})[0]['type'];
+ temp[changed_column] = _.where(this.columns, {name: changed_column})[0]['type'];
self.handler.data_store.updated[_pk]['data_type'] = temp;
}
}
@@ -811,25 +812,37 @@ define(
// self.handler.data_store.added will hold all the newly added rows/data
var _key = epicRandomString(10),
column = args.column,
- item = args.item, data_length = this.grid.getDataLength();
+ item = args.item, data_length = this.grid.getDataLength(),
+ dataView = this.grid.getData();
if(item) {
item.__temp_PK = _key;
}
- collection.push(item);
+
+ dataView.addItem(item);
self.handler.data_store.added[_key] = {'err': false, 'data': item};
self.handler.data_store.added_index[data_length] = _key;
// Fetch data type & add it for the column
var temp = {};
- temp[column.pos] = _.where(this.columns, {pos: column.pos})[0]['type'];
+ temp[column.field] = _.where(this.columns, {pos: column.pos})[0]['type'];
self.handler.data_store.added[_key]['data_type'] = temp;
- grid.invalidateRows([collection.length - 1]);
+ grid.invalidateRows([data_length - 1]);
grid.updateRowCount();
grid.render();
// Enable save button
$("#btn-save").prop('disabled', false);
}.bind(editor_data));
+ // Listen to the grid's viewportChanged event to load the next chunk of data.
+ grid.onViewportChanged.subscribe(function(e, args) {
+ var rendered_range = args.grid.getRenderedRange(),
+ data_len = args.grid.getDataLength();
+ // Start fetching the next batch of records before reaching the bottom.
+ if (self.handler.has_more_rows && !self.handler.fetching_rows && rendered_range.bottom > data_len - 100) {
+ // Fetch asynchronously
+ setTimeout(self.fetch_next.bind(self));
+ }
+ });
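The handler above decides to prefetch once the rendered range comes within 100 rows of the end of the loaded data, and only when more rows exist and no fetch is already in flight. A minimal sketch of that decision, with hypothetical names:

```python
# Hypothetical sketch of the prefetch decision made by the
# onViewportChanged handler above: request the next batch only when
# more rows exist, no fetch is in flight, and the viewport is close
# to the bottom of what has been loaded so far.
PREFETCH_MARGIN = 100  # rows before the end at which to start fetching


def should_fetch_next(has_more_rows, fetching_rows, rendered_bottom,
                      data_len, margin=PREFETCH_MARGIN):
    """Return True when the next chunk of rows should be requested."""
    return (has_more_rows
            and not fetching_rows
            and rendered_bottom > data_len - margin)


# 2000 rows loaded, viewport bottom at row 1950 -> within the margin
print(should_fetch_next(True, False, 1950, 2000))
```

Starting the fetch before the user actually hits the bottom hides most of the round-trip latency behind scrolling.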
// Resize SlickGrid when window resize
$( window ).resize( function() {
// Resize grid only when 'Data Output' panel is visible.
@@ -852,6 +865,65 @@ define(
if(self.data_output_panel.isVisible())
self.grid_resize(grid);
});
+
+ for (var i = 0; i < collection.length; i++) {
+ // Convert each row from a 2D array to a dict
+ var item = {};
+ for (var j = 1; j < grid_columns.length; j++) {
+ item[grid_columns[j]['field']] = collection[i][grid_columns[j]['pos']]
+ }
+
+ item['__temp_PK'] = i.toString();
+ collection[i] = item;
+ }
+ dataView.setItems(collection, '__temp_PK');
+ },
+
+ fetch_next: function() {
+ var self = this;
+ // Prevent a new fetch while a previous fetch operation is
+ // still in progress.
+ self.handler.fetching_rows = true;
+
+ $.ajax({
+ url: "{{ url_for('sqleditor.index') }}" + "fetch/" + self.transId,
+ method: 'GET',
+ success: function(res) {
+ self.handler.has_more_rows = res.data.has_more_rows;
+ self.update_grid_data(res.data.result);
+ },
+ error: function(e) {
+ self.handler.has_more_rows = false;
+ self.handler.fetching_rows = false;
+
+ if (e.readyState == 0) {
+ self.update_msg_history(false,
+ "{{ _('Not connected to the server or the connection to the server has been closed.') }}"
+ );
+ return;
+ }
+ }
+ });
+ },
+
+ update_grid_data: function(data) {
+ var data_len = this.grid.getDataLength();
+
+ this.dataView.beginUpdate();
+
+ for (var i = 0; i < data.length; i++) {
+ // Convert each row from a 2D array to a dict.
+ var item = {};
+ for (var j = 1; j < this.grid_columns.length; j++) {
+ item[this.grid_columns[j]['field']] = data[i][this.grid_columns[j]['pos']]
+ }
+
+ item['__temp_PK'] = (data_len + i).toString();
+ this.dataView.addItem(item);
+ }
+
+ this.dataView.endUpdate();
+ this.handler.fetching_rows = false;
},
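Both `setItems` above and `update_grid_data` perform the same reshaping: each 2D-array row from the server becomes a dict keyed by the column field names, tagged with a synthetic `__temp_PK` so the DataView can address rows by key. A rough Python equivalent of that conversion (the column shape is simplified; names are illustrative):

```python
def rows_to_items(rows, columns, offset=0):
    """Convert 2D-array rows into dicts keyed by column field names.

    rows: list of lists, as returned by the fetch endpoint
    columns: list of {'field': ..., 'pos': ...} entries describing the
        data columns (the grid's first, row-header column carries no
        data, so callers pass only the data columns here)
    offset: number of rows already loaded; the synthetic primary key
        continues numbering from there
    """
    items = []
    for i, row in enumerate(rows):
        item = {col['field']: row[col['pos']] for col in columns}
        # Synthetic key letting the DataView address this row later
        item['__temp_PK'] = str(offset + i)
        items.append(item)
    return items


cols = [{'field': 'id', 'pos': 0}, {'field': 'name', 'pos': 1}]
print(rows_to_items([[1, 'a'], [2, 'b']], cols, offset=2000))
```

Keying items by field name rather than position is what lets the rest of the patch replace `column_info.pos` lookups with `column_info.field`.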
/* This function is responsible to render output grid */
@@ -1526,6 +1598,8 @@ define(
self.explain_buffers = false;
self.explain_timing = false;
self.is_new_browser_tab = is_new_browser_tab;
+ self.has_more_rows = false;
+ self.fetching_rows = false;
// We do not allow to call the start multiple times.
if (self.gridView)
@@ -1629,6 +1703,8 @@ define(
self.query_start_time = new Date();
self.rows_affected = 0;
self._init_polling_flags();
+ self.has_more_rows = false;
+ self.fetching_rows = false;
self.trigger(
'pgadmin-sqleditor:loading-icon:show',
@@ -1707,45 +1783,16 @@ define(
});
},
- // This function makes the ajax call to fetch columns for last async query,
- get_columns: function(poll_result) {
- var self = this;
- // Check the flag and decide if we need to fetch columns from server
- // or use the columns data stored locally from previous call?
- if (self.FETCH_COLUMNS_FROM_SERVER) {
- $.ajax({
- url: "{{ url_for('sqleditor.index') }}" + "columns/" + self.transId,
- method: 'GET',
- success: function(res) {
- poll_result.colinfo = res.data.columns;
- poll_result.primary_keys = res.data.primary_keys;
- self.call_render_after_poll(poll_result);
- // Set a flag to get columns to false & set the value for future use
- self.FETCH_COLUMNS_FROM_SERVER = false;
- self.COLUMNS_DATA = res;
- },
- error: function(e) {
- var msg = e.responseText;
- if (e.responseJSON != undefined && e.responseJSON.errormsg != undefined)
- msg = e.responseJSON.errormsg;
- alertify.error(msg, 5);
- }
- });
- } else {
- // Use the previously saved columns data
- poll_result.colinfo = self.COLUMNS_DATA.data.columns;
- poll_result.primary_keys = self.COLUMNS_DATA.data.primary_keys;
- self.call_render_after_poll(poll_result);
- }
- },
-
// This is a wrapper to call _render function
// We need this because we have separated columns route & result route
// We need to combine both result here in wrapper before rendering grid
call_render_after_poll: function(res) {
var self = this;
self.query_end_time = new Date();
- self.rows_affected = res.rows_affected;
+ self.rows_affected = res.rows_affected;
+ self.rows_fetched_from = res.rows_fetched_from;
+ self.rows_fetched_to = res.rows_fetched_to;
+ self.has_more_rows = res.has_more_rows;
/* If no column information is available it means query
runs successfully with no result to display. In this
@@ -1794,7 +1841,8 @@ define(
'pgadmin-sqleditor:loading-icon:message',
"{{ _('Loading data from the database server and rendering...') }}"
);
- self.get_columns(res.data);
+
+ self.call_render_after_poll(res.data);
}
else if (res.data.status === 'Busy') {
// If status is Busy then poll the result by recursive call to the poll function
@@ -1966,132 +2014,113 @@ define(
// Hide the loading icon
self.trigger('pgadmin-sqleditor:loading-icon:hide');
$("#btn-flash").prop('disabled', false);
- }.bind(self),
- function() {
- this.trigger('pgadmin-sqleditor:loading-icon:hide');
- $("#btn-flash").prop('disabled', false);
}.bind(self)
);
},
// This function creates the columns as required by the backgrid
- _fetch_column_metadata: function(data, cb, _fail) {
+ _fetch_column_metadata: function(data, cb) {
var colinfo = data.colinfo,
primary_keys = data.primary_keys,
result = data.result,
columns = [],
self = this;
- self.trigger(
- 'pgadmin-sqleditor:loading-icon:message',
- "{{ _('Retrieving information about the columns returned...') }}"
- );
+ // Store pg_types in an array
+ var pg_types = new Array();
+ _.each(data.types, function(r) {
+ pg_types[r.oid] = [r.typname];
+ });
- // Make ajax call to fetch the pg types to map numeric data type
- $.ajax({
- url: "{{ url_for('sqleditor.index') }}" + "fetch/types/" + self.transId,
- method: 'GET',
- success: function(res) {
- if (res.data.status) {
- // Store pg_types in an array
- var pg_types = new Array();
- _.each(res.data.result.rows, function(r) {
- pg_types[r.oid] = [r.typname];
- });
+ // Create columns required by slick grid to render
+ _.each(colinfo, function(c) {
+ var is_primary_key = false;
- // Create columns required by backgrid to render
- _.each(colinfo, function(c) {
- var is_primary_key = false;
+ // Check whether table have primary key
+ if (_.size(primary_keys) > 0) {
+ _.each(primary_keys, function (value, key) {
+ if (key === c.name)
+ is_primary_key = true;
+ });
+ }
- // Check whether table have primary key
- if (_.size(primary_keys) > 0) {
- _.each(primary_keys, function (value, key) {
- if (key === c.name)
- is_primary_key = true;
- });
- }
+ // To show column label and data type in multiline,
+ // The elements should be put inside the div.
+ // Create column label and type.
+ var col_type = column_label = '';
+ var type = pg_types[c.type_code] ?
+ pg_types[c.type_code][0] :
+ // This is the case where user might
+ // have use casting so we will use type
+ // returned by cast function
+ pg_types[pg_types.length - 1][0] ?
+ pg_types[pg_types.length - 1][0] : 'unknown';
+
+ if (!is_primary_key)
+ col_type += ' ' + type;
+ else
+ col_type += ' [PK] ' + type;
- // To show column label and data type in multiline,
- // The elements should be put inside the div.
- // Create column label and type.
- var col_type = column_label = '';
- var type = pg_types[c.type_code] ?
- pg_types[c.type_code][0] :
- // This is the case where user might
- // have use casting so we will use type
- // returned by cast function
- pg_types[pg_types.length - 1][0] ?
- pg_types[pg_types.length - 1][0] : 'unknown';
-
- if (!is_primary_key)
- col_type += ' ' + type;
- else
- col_type += ' [PK] ' + type;
+ if (c.precision && c.precision >= 0 && c.precision != 65535) {
+ col_type += ' (' + c.precision;
+ col_type += c.scale && c.scale != 65535 ?
+ ',' + c.scale + ')':
+ ')';
+ }
- if (c.precision && c.precision >= 0 && c.precision != 65535) {
- col_type += ' (' + c.precision;
- col_type += c.scale && c.scale != 65535 ?
- ',' + c.scale + ')':
- ')';
- }
+ // Identify cell type of column.
+ switch(type) {
+ case "json":
+ case "json[]":
+ case "jsonb":
+ case "jsonb[]":
+ col_cell = 'Json';
+ break;
+ case "smallint":
+ case "integer":
+ case "bigint":
+ case "decimal":
+ case "numeric":
+ case "real":
+ case "double precision":
+ col_cell = 'number';
+ break;
+ case "boolean":
+ col_cell = 'boolean';
+ break;
+ case "character":
+ case "character[]":
+ case "character varying":
+ case "character varying[]":
+ if (c.internal_size && c.internal_size >= 0 && c.internal_size != 65535) {
+ // Update column type to display length on column header
+ col_type += ' (' + c.internal_size + ')';
+ }
+ col_cell = 'string';
+ break;
+ default:
+ col_cell = 'string';
+ }
- // Identify cell type of column.
- switch(type) {
- case "json":
- case "json[]":
- case "jsonb":
- case "jsonb[]":
- col_cell = 'Json';
- break;
- case "smallint":
- case "integer":
- case "bigint":
- case "decimal":
- case "numeric":
- case "real":
- case "double precision":
- col_cell = 'number';
- break;
- case "boolean":
- col_cell = 'boolean';
- break;
- case "character":
- case "character[]":
- case "character varying":
- case "character varying[]":
- if (c.internal_size && c.internal_size >= 0 && c.internal_size != 65535) {
- // Update column type to display length on column header
- col_type += ' (' + c.internal_size + ')';
- }
- col_cell = 'string';
- break;
- default:
- col_cell = 'string';
- }
+ column_label = c.display_name + '<br>' + col_type;
- column_label = c.display_name + '<br>' + col_type;
-
- var col = {
- 'name': c.name,
- 'pos': c.pos,
- 'label': column_label,
- 'cell': col_cell,
- 'can_edit': self.can_edit,
- 'type': type
- };
- columns.push(col);
- });
- }
- else {
- alertify.alert('Fetching Type Error', res.data.result);
- }
- self.columns = columns;
- if (cb && typeof(cb) == 'function') {
- cb();
- }
- },
- fail: _fail
+ var col = {
+ 'name': c.name,
+ 'pos': c.pos,
+ 'label': column_label,
+ 'cell': col_cell,
+ 'can_edit': self.can_edit,
+ 'type': type
+ };
+ columns.push(col);
});
+
+ self.columns = columns;
+ if (cb && typeof(cb) == 'function') {
+ cb();
+ }
},
// This function is used to raise appropriate message.
@@ -2102,22 +2131,21 @@ define(
self.gridView.messages_panel.focus();
- if (self.is_query_tool) {
- if (clear_grid) {
- // Delete grid
- if (self.gridView.handler.slickgrid) {
- self.gridView.handler.slickgrid.destroy();
-
- }
- // Misc cleaning
- self.columns = undefined;
- self.collection = undefined;
+ if (clear_grid) {
+ // Delete grid
+ if (self.gridView.handler.slickgrid) {
+ self.gridView.handler.slickgrid.destroy();
- $('.sql-editor-message').text(msg);
- } else {
- $('.sql-editor-message').append(msg);
}
+ // Misc cleaning
+ self.columns = undefined;
+ self.collection = undefined;
+
+ $('.sql-editor-message').text(msg);
+ } else {
+ $('.sql-editor-message').append(msg);
}
+
// Scroll automatically when msgs appends to element
setTimeout(function(){
$(".sql-editor-message").scrollTop($(".sql-editor-message")[0].scrollHeight);;
@@ -2174,67 +2202,71 @@ define(
is_added = _.size(self.data_store.added),
is_updated = _.size(self.data_store.updated);
- // Remove newly added rows from staged rows as we don't want to send them on server
- if(is_added) {
- _.each(self.data_store.added, function(val, key) {
- if(key in self.data_store.staged_rows) {
- // Remove the row from data store so that we do not send it on server
- deleted_keys.push(key);
- delete self.data_store.staged_rows[key];
- delete self.data_store.added[key]
- }
- });
+ // Remove newly added rows from staged rows, as we don't want to send them to the server
+ if(is_added) {
+ _.each(self.data_store.added, function(val, key) {
+ if(key in self.data_store.staged_rows) {
+ // Remove the row from the data store so that we do not send it to the server
+ deleted_keys.push(key);
+ delete self.data_store.staged_rows[key];
+ delete self.data_store.added[key]
}
+ });
+ }
- // If only newly rows to delete and no data is there to send on server
- // then just re-render the grid
- if(_.size(self.data_store.staged_rows) == 0) {
- var grid = self.slickgrid, data = grid.getData(), idx = 0;
- if(deleted_keys.length){
- // Remove new rows from grid data using deleted keys
- data = _.reject(data, function(d){
- return (d && _.indexOf(deleted_keys, d.__temp_PK) > -1)
- });
- }
- grid.resetActiveCell();
- grid.setData(data, true);
- grid.setSelectedRows([]);
- grid.invalidate();
- // Nothing to copy or delete here
- $("#btn-delete-row").prop('disabled', true);
- $("#btn-copy-row").prop('disabled', true);
- if(_.size(self.data_store.added) || is_updated) {
- // Do not disable save button if there are
- // any other changes present in grid data
- $("#btn-save").prop('disabled', false);
- } else {
- $("#btn-save").prop('disabled', true);
- }
- alertify.success("{{ _('Row(s) deleted') }}");
- } else {
- // There are other data to needs to be updated on server
- if(is_updated) {
- alertify.alert("{{ _('Operation failed') }}",
- "{{ _('There are unsaved changes in grid, Please save them first to avoid inconsistency in data') }}"
- );
- return;
- }
- alertify.confirm("{{ _('Delete Row(s)') }}",
- "{{ _('Are you sure you wish to delete selected row(s)?') }}",
- function() {
- $("#btn-delete-row").prop('disabled', true);
- $("#btn-copy-row").prop('disabled', true);
- // Change the state
- self.data_store.deleted = self.data_store.staged_rows;
- self.data_store.staged_rows = {};
- // Save the changes on server
- self._save();
- },
- function() {
- // Do nothing as user canceled the operation.
- }
- ).set('labels', {ok:'Yes', cancel:'No'});
+ // If there are only newly added rows to delete and no data to send
+ // to the server, then just re-render the grid
+ if(_.size(self.data_store.staged_rows) == 0) {
+ var grid = self.slickgrid,
+ dataView = grid.getData(),
+ data = dataView.getItems(),
+ idx = 0;
+
+ if(deleted_keys.length){
+ // Remove new rows from grid data using deleted keys
+ data = _.reject(data, function(d){
+ return (d && _.indexOf(deleted_keys, d.__temp_PK) > -1)
+ });
+ }
+ grid.resetActiveCell();
+ grid.setData(data, true);
+ grid.setSelectedRows([]);
+ grid.invalidate();
+ // Nothing to copy or delete here
+ $("#btn-delete-row").prop('disabled', true);
+ $("#btn-copy-row").prop('disabled', true);
+ if(_.size(self.data_store.added) || is_updated) {
+ // Do not disable save button if there are
+ // any other changes present in grid data
+ $("#btn-save").prop('disabled', false);
+ } else {
+ $("#btn-save").prop('disabled', true);
+ }
+ alertify.success("{{ _('Row(s) deleted') }}");
+ } else {
+ // There is other data that needs to be updated on the server
+ if(is_updated) {
+ alertify.alert("{{ _('Operation failed') }}",
+ "{{ _('There are unsaved changes in the grid. Please save them first to avoid data inconsistency.') }}"
+ );
+ return;
+ }
+ alertify.confirm("{{ _('Delete Row(s)') }}",
+ "{{ _('Are you sure you wish to delete selected row(s)?') }}",
+ function() {
+ $("#btn-delete-row").prop('disabled', true);
+ $("#btn-copy-row").prop('disabled', true);
+ // Change the state
+ self.data_store.deleted = self.data_store.staged_rows;
+ self.data_store.staged_rows = {};
+ // Save the changes on server
+ self._save();
+ },
+ function() {
+ // Do nothing as user canceled the operation.
}
+ ).set('labels', {ok:'Yes', cancel:'No'});
+ }
},
@@ -2297,26 +2329,24 @@ define(
data: JSON.stringify(req_data),
success: function(res) {
var grid = self.slickgrid,
- data = grid.getData();
+ dataView = grid.getData(),
+ data = dataView.getItems();
if (res.data.status) {
// Remove deleted rows from client as well
if(is_deleted) {
var rows = grid.getSelectedRows();
- /* In JavaScript sorting by default is lexical,
- * To make sorting numerical we need to pass function
- * After that we will Reverse the order of sorted array
- * so that when we remove it does not affect array index
- */
if(data.length == rows.length) {
// This means all the rows are selected, clear all data
data = [];
+ dataView.setItems(data, '__temp_PK');
} else {
- rows = rows.sort(function(a,b){return a - b}).reverse();
- rows.forEach(function(idx) {
- data.splice(idx, 1);
- });
+ dataView.beginUpdate();
+ for (var i = 0; i < rows.length; i++) {
+ var item = grid.getDataItem(rows[i]);
+ dataView.deleteItem(item['__temp_PK']);
+ }
+ dataView.endUpdate();
}
- grid.setData(data, true);
grid.setSelectedRows([]);
}
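The replacement above deletes rows through the DataView by their `__temp_PK` instead of splicing the backing array by index. Deleting by stable key is why the old sort-then-reverse dance over row indices could be dropped: removing an item no longer shifts the positions of the ones still pending removal. A sketch of the idea (hypothetical helper, not the grid API):

```python
def delete_by_keys(items, keys):
    """Remove items whose '__temp_PK' is in keys.

    Because removal is keyed, the order in which rows were selected is
    irrelevant -- no index sorting or reversing is required.
    """
    keys = set(keys)
    return [item for item in items if item['__temp_PK'] not in keys]


data = [{'__temp_PK': '0'}, {'__temp_PK': '1'}, {'__temp_PK': '2'}]
# Selection order does not matter
print(delete_by_keys(data, ['2', '0']))
```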
@@ -2387,9 +2417,13 @@ define(
// Find index of row at fault from grid data
_find_rowindex: function(rowid) {
- var self = this;
- var grid = self.slickgrid,
- data = grid.getData(), _rowid, count = 0, _idx = -1;
+ var self = this,
+ grid = self.slickgrid,
+ dataView = grid.getData(),
+ data = dataView.getItems(),
+ _rowid,
+ count = 0,
+ _idx = -1;
// If _rowid is object then it's update/delete operation
if(_.isObject(rowid)) {
_rowid = rowid;
@@ -2616,11 +2650,6 @@ define(
// This function will set the required flag for polling response data
_init_polling_flags: function() {
var self = this;
- // Set a flag to get columns
- self.FETCH_COLUMNS_FROM_SERVER = true;
- // We will set columns data in this variable for future use once we fetch it
- // from server
- self.COLUMNS_DATA = {};
// To get a timeout for polling fallback timer in seconds in
// regards to elapsed time
@@ -2716,7 +2745,7 @@ define(
return;
// Add the column field and its value to data
- data[column_info.field] = _values[column_info.pos] || '';
+ data[column_info.field] = _values[column_info.field] || '';
self.trigger(
'pgadmin-sqleditor:loading-icon:show',
@@ -2786,7 +2815,7 @@ define(
return;
// Add the column field and its value to data
- data[column_info.field] = _values[column_info.pos] || '';
+ data[column_info.field] = _values[column_info.field] || '';
self.trigger(
'pgadmin-sqleditor:loading-icon:show',
@@ -2947,60 +2976,63 @@ define(
// This function will paste the selected row.
_paste_row: function() {
- var self = this, col_info = {},
- grid = self.slickgrid,
- data = grid.getData();
- // Deep copy
- var copied_rows = $.extend(true, [], self.copied_rows),
- _tmp_copied_row = {};
+ var self = this,
+ col_info = {},
+ grid = self.slickgrid,
+ dataView = grid.getData(),
+ data = dataView.getItems(),
+ // Deep copy
+ copied_rows = $.extend(true, [], self.copied_rows),
+ _tmp_copied_row = {};
// If there are rows to paste?
- if(copied_rows.length > 0) {
- // Enable save button so that user can
- // save newly pasted rows on server
- $("#btn-save").prop('disabled', false);
- // Generate Unique key for each pasted row(s)
- _.each(copied_rows, function(row) {
- var _pk = epicRandomString(8);
- row.__temp_PK = _pk;
- });
- data = data.concat(copied_rows);
- grid.setData(data, true);
- grid.updateRowCount();
- grid.setSelectedRows([]);
- grid.invalidateAllRows();
- grid.render();
-
- // Fetch column name & its data type
- _.each(self.columns, function(c) {
- col_info[String(c.pos)] = c.type;
- });
+ if(copied_rows.length > 0) {
+ // Enable save button so that user can
+ // save newly pasted rows on server
+ $("#btn-save").prop('disabled', false);
+ // Generate Unique key for each pasted row(s)
+ _.each(copied_rows, function(row) {
+ var _pk = epicRandomString(8);
+ row.__temp_PK = _pk;
+ });
- // insert these data in data_store as well to save them on server
- for (var j = 0; j < copied_rows.length; j += 1) {
- self.data_store.added[copied_rows[j].__temp_PK] = {
- 'data_type': {},
- 'data': {}
- };
- self.data_store.added[copied_rows[j].__temp_PK]['data_type'] = col_info;
- // We need to convert it from array to dict so that server can
- // understand the data properly
- _.each(copied_rows[j], function(val, key) {
- // If value is array then convert it to string
- if(_.isArray(val)) {
- _tmp_copied_row[String(key)] = val.toString();
- // If value is object then stringify it
- } else if(_.isObject(val)) {
- _tmp_copied_row[j][String(key)] = JSON.stringify(val);
- } else {
- _tmp_copied_row[String(key)] = val;
- }
- });
- self.data_store.added[copied_rows[j].__temp_PK]['data'] = _tmp_copied_row;
- // reset the variable
- _tmp_copied_row = {};
- }
+ dataView.beginUpdate();
+ for (var i = 0; i < copied_rows.length; i++) {
+ dataView.addItem(copied_rows[i]);
}
+ dataView.endUpdate();
+ grid.setSelectedRows([]);
+
+ // Fetch column name & its data type
+ _.each(self.columns, function(c) {
+ col_info[String(c.name)] = c.type;
+ });
+
+ // Insert this data into data_store as well, so it can be saved on the server
+ for (var j = 0; j < copied_rows.length; j += 1) {
+ self.data_store.added[copied_rows[j].__temp_PK] = {
+ 'data_type': {},
+ 'data': {}
+ };
+ self.data_store.added[copied_rows[j].__temp_PK]['data_type'] = col_info;
+ // We need to convert it from array to dict so that server can
+ // understand the data properly
+ _.each(copied_rows[j], function(val, key) {
+ // If value is array then convert it to string
+ if(_.isArray(val)) {
+ _tmp_copied_row[String(key)] = val.toString();
+ // If value is object then stringify it
+ } else if(_.isObject(val)) {
+ _tmp_copied_row[String(key)] = JSON.stringify(val);
+ } else {
+ _tmp_copied_row[String(key)] = val;
+ }
+ });
+ self.data_store.added[copied_rows[j].__temp_PK]['data'] = _tmp_copied_row;
+ // reset the variable
+ _tmp_copied_row = {};
+ }
+ }
},
// This function will set the limit for SQL query
@@ -3070,6 +3102,9 @@ define(
sql = '',
history_msg = '';
+ self.has_more_rows = false;
+ self.fetching_rows = false;
+
/* If code is selected in the code mirror then execute
* the selected part else execute the complete code.
*/
diff --git a/web/pgadmin/utils/driver/abstract.py b/web/pgadmin/utils/driver/abstract.py
index 9b2363c..7db3e37 100644
--- a/web/pgadmin/utils/driver/abstract.py
+++ b/web/pgadmin/utils/driver/abstract.py
@@ -101,6 +101,12 @@ class BaseConnection(object):
- Implement this method to execute the given query and returns the result
as an array of dict (column name -> value) format.
+ * def async_fetchmany_2darray(records=-1, formatted_exception_msg=False):
+ - Implement this method to retrieve the result of an asynchronous query
+ which was polled with the no_result flag set to True.
+ It returns the result as a two-dimensional array.
+ If records is -1, fetchmany behaves like fetchall.
+
* connected()
- Implement this method to get the status of the connection. It should
return True for connected, otherwise False
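The `records=-1` convention described in the docstring above can be sketched as follows (illustrative only, with a stand-in cursor; not the driver code):

```python
class FakeCursor:
    """Stand-in cursor exposing fetchall/fetchmany, for illustration only."""

    def __init__(self, rows):
        self._rows = list(rows)

    def fetchall(self):
        rows, self._rows = self._rows, []
        return rows

    def fetchmany(self, size):
        batch, self._rows = self._rows[:size], self._rows[size:]
        return batch


def fetch_batch(cursor, records=-1):
    """Fetch `records` rows from the cursor; -1 means fetch everything left."""
    if records == -1:
        return cursor.fetchall()
    return cursor.fetchmany(records)


cur = FakeCursor(range(5))
print(fetch_batch(cur, 2))   # first batch of two rows
print(fetch_batch(cur, -1))  # the remaining rows in one go
```

Repeated calls with a positive `records` walk through the materialised result in batches, which is what lets the client render the first chunk quickly and pull the rest on demand.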
@@ -133,7 +139,7 @@ class BaseConnection(object):
- Implement this method to wait for asynchronous connection with timeout.
This must be a non blocking call.
- * poll(formatted_exception_msg)
+ * poll(formatted_exception_msg, no_result)
- Implement this method to poll the data of query running on asynchronous
connection.
@@ -180,6 +186,10 @@ class BaseConnection(object):
pass
@abstractmethod
+ def async_fetchmany_2darray(self, records=-1, formatted_exception_msg=False):
+ pass
+
+ @abstractmethod
def connected(self):
pass
@@ -208,7 +218,7 @@ class BaseConnection(object):
pass
@abstractmethod
- def poll(self, formatted_exception_msg=True):
+ def poll(self, formatted_exception_msg=True, no_result=False):
pass
@abstractmethod
diff --git a/web/pgadmin/utils/driver/psycopg2/__init__.py b/web/pgadmin/utils/driver/psycopg2/__init__.py
index e474817..c48ab43 100644
--- a/web/pgadmin/utils/driver/psycopg2/__init__.py
+++ b/web/pgadmin/utils/driver/psycopg2/__init__.py
@@ -973,6 +973,55 @@ Failed to execute query (execute_void) for the server #{server_id} - {conn_id}
return True, {'columns': columns, 'rows': rows}
+ def async_fetchmany_2darray(self, records=2000, formatted_exception_msg=False):
+ """
+ The caller should poll and check that the status is ASYNC_OK before
+ calling this function.
+ Args:
+ records: number of records to fetch; use -1 to fetch all.
+ formatted_exception_msg:
+
+ Returns:
+
+ """
+ cur = self.__async_cursor
+ if not cur:
+ return False, gettext(
+ "Cursor could not be found for the async connection."
+ )
+
+ if self.conn.isexecuting():
+ return False, gettext(
+ "Asynchronous query execution/operation underway."
+ )
+
+ if self.row_count > 0:
+ result = []
+ # For DDL operations, we may not have a result.
+ #
+ # Because there is no direct way to differentiate DML and
+ # DDL operations, we have to rely on an exception to figure
+ # that out at the moment.
+ try:
+ if records == -1:
+ res = cur.fetchall()
+ else:
+ res = cur.fetchmany(records)
+ for row in res:
+ new_row = []
+ for col in self.column_info:
+ new_row.append(row[col['name']])
+ result.append(new_row)
+ except psycopg2.ProgrammingError:
+ result = None
+ else:
+ # The user performed an operation which does not produce
+ # records as a result, e.g. DDL operations.
+ return True, None
+
+ return True, result
+
def connected(self):
if self.conn:
if not self.conn.closed:
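`async_fetchmany_2darray` above re-orders each dict-style row into a plain list following `self.column_info`, so the client receives a compact 2D array instead of repeated key/value mappings. That reshaping in isolation (hypothetical names, standing in for psycopg2's DictRow):

```python
def to_2darray(rows, column_info):
    """Project mapping-style rows into lists ordered by column_info.

    rows: iterable of mappings (like psycopg2 DictRow objects)
    column_info: list of {'name': ...} entries in display order
    """
    return [[row[col['name']] for col in column_info] for row in rows]


cols = [{'name': 'id'}, {'name': 'name'}]
rows = [{'name': 'a', 'id': 1}, {'name': 'b', 'id': 2}]
print(to_2darray(rows, cols))
```

Sending positional arrays keeps the fetch payload small for wide result sets; the client rebuilds dicts from them using the column metadata it already holds.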
@@ -1119,7 +1168,7 @@ Failed to reset the connection to the server due to following error:
"poll() returned %s from _wait_timeout function" % state
)
- def poll(self, formatted_exception_msg=False):
+ def poll(self, formatted_exception_msg=False, no_result=False):
"""
This function is a wrapper around connection's poll function.
It internally uses the _wait_timeout method to poll the
@@ -1129,6 +1178,7 @@ Failed to reset the connection to the server due to following error:
Args:
formatted_exception_msg: if True then function return the formatted
exception message, otherwise error string.
+ no_result: If True, only the poll status is returned; the result set is not fetched.
"""
cur = self.__async_cursor
@@ -1184,23 +1234,23 @@ Failed to reset the connection to the server due to following error:
pos += 1
self.row_count = cur.rowcount
-
- if cur.rowcount > 0:
- result = []
- # For DDL operation, we may not have result.
- #
- # Because - there is not direct way to differentiate DML and
- # DDL operations, we need to rely on exception to figure that
- # out at the moment.
- try:
- for row in cur:
- new_row = []
- for col in self.column_info:
- new_row.append(row[col['name']])
- result.append(new_row)
-
- except psycopg2.ProgrammingError:
- result = None
+ if not no_result:
+ if cur.rowcount > 0:
+ result = []
+ # For DDL operations, we may not have a result.
+ #
+ # Because there is no direct way to differentiate DML and
+ # DDL operations, we have to rely on an exception to figure
+ # that out at the moment.
+ try:
+ for row in cur:
+ new_row = []
+ for col in self.column_info:
+ new_row.append(row[col['name']])
+ result.append(new_row)
+
+ except psycopg2.ProgrammingError:
+ result = None
return status, result
--
Sent via pgadmin-hackers mailing list ([email protected])
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgadmin-hackers