I only have to start Liferea and it crashes a few seconds later.

"liferea --debug-all" gave me this:

CONF: proxy auto detect is configured
CONF: no proxy GNOME of $http_proxy configuration found...
CONF: Proxy settings are now NULL:0 NULL:NULL
NET: proxy set to http://(null):(null)@(null):0
UPDATE: network manager: registering network state change callback
DB: Opening DB file /home/johan/.liferea_1.6/liferea.db...
DB: current DB schema version: 7
DB: executing SQL: CREATE TABLE items (title TEXT, read INTEGER, new INTEGER, updated INTEGER, popup INTEGER, marked INTEGER, source TEXT, source_id TEXT, valid_guid INTEGER, real_source_url TEXT, real_source_title TEXT, description TEXT, date INTEGER, comment_feed_id TEXT, comment INTEGER);
DB:  -> result: 1 (table items already exists)
DB: executing SQL: CREATE INDEX items_idx ON items (source_id);
DB:  -> result: 1 (index items_idx already exists)
DB: executing SQL: CREATE INDEX items_idx2 ON items (comment_feed_id);
DB:  -> result: 1 (index items_idx2 already exists)
DB: executing SQL: CREATE TABLE itemsets (item_id INTEGER, parent_item_id INTEGER, node_id TEXT, parent_node_id TEXT, read INTEGER, comment INTEGER, PRIMARY KEY (item_id, node_id));
DB:  -> result: 1 (table itemsets already exists)
DB: executing SQL: CREATE INDEX itemset_idx  ON itemsets (node_id);
DB:  -> result: 1 (index itemset_idx already exists)
DB: executing SQL: CREATE INDEX itemset_idx2 ON itemsets (item_id);
DB:  -> result: 1 (index itemset_idx2 already exists)
DB: executing SQL: CREATE TABLE metadata (item_id INTEGER, nr INTEGER, key TEXT, value TEXT, PRIMARY KEY (item_id, nr));
DB:  -> result: 1 (table metadata already exists)
DB: executing SQL: CREATE INDEX metadata_idx ON metadata (item_id);
DB:  -> result: 1 (index metadata_idx already exists)
DB: executing SQL: CREATE TABLE attention_stats (category_id TEXT, category_name TEXT, count INTEGER, PRIMARY KEY (category_id));
DB:  -> result: 1 (table attention_stats already exists)
DB: executing SQL: CREATE TABLE subscription (node_id STRING, source STRING, orig_source STRING, filter_cmd STRING, update_interval INTEGER, default_interval INTEGER, discontinued INTEGER, available INTEGER, PRIMARY KEY (node_id));
DB:  -> result: 1 (table subscription already exists)
DB: executing SQL: CREATE TABLE update_state (node_id STRING, last_modified STRING, etag STRING, last_update INTEGER, last_favicon_update INTEGER, PRIMARY KEY (node_id));
DB:  -> result: 1 (table update_state already exists)
DB: executing SQL: CREATE TABLE subscription_metadata (node_id STRING, nr INTEGER, key TEXT, value TEXT, PRIMARY KEY (node_id, nr));
DB:  -> result: 1 (table subscription_metadata already exists)
DB: executing SQL: CREATE INDEX subscription_metadata_idx ON subscription_metadata (node_id);
DB:  -> result: 1 (index subscription_metadata_idx already exists)
DB: executing SQL: CREATE TABLE node (node_id STRING, parent_id STRING, title STRING, type INTEGER, expanded INTEGER, view_mode INTEGER, sort_column INTEGER, sort_reversed INTEGER, PRIMARY KEY (node_id));
DB:  -> result: 1 (table node already exists)
DB: executing SQL: CREATE TABLE view_state (node_id STRING, unread INTEGER, count INTEGER, PRIMARY KEY (node_id));
DB:  -> result: 1 (table view_state already exists)
DB: table setup took 0,002s
DB: executing SQL: DROP TRIGGER item_insert;
DB:  -> result: 0 (success)
DB: executing SQL: DROP TRIGGER item_update;
DB:  -> result: 0 (success)
DB: executing SQL: DROP TRIGGER item_removal;
DB:  -> result: 0 (success)
DB: executing SQL: DROP TRIGGER subscription_removal;
DB:  -> result: 0 (success)
DB: Checking for items not referenced in table 'itemsets'...
DB: executing SQL: BEGIN; CREATE TEMP TABLE tmp_id ( id ); INSERT INTO tmp_id SELECT ROWID FROM items WHERE ROWID NOT IN (SELECT item_id FROM itemsets); DELETE FROM items WHERE ROWID IN (SELECT id FROM tmp_id LIMIT 1000); DROP TABLE tmp_id; END;
DB:  -> result: 0 (success)
DB: Checking for invalid item ids in table 'itemsets'...
DB: executing SQL: BEGIN; CREATE TEMP TABLE tmp_id ( id ); INSERT INTO tmp_id SELECT item_id FROM itemsets WHERE item_id NOT IN (SELECT ROWID FROM items); DELETE FROM itemsets WHERE item_id IN (SELECT id FROM tmp_id LIMIT 1000); DROP TABLE tmp_id; END;
DB:  -> result: 0 (success)
DB: Checking for items without a feed list node...
DB: executing SQL: DELETE FROM itemsets WHERE comment = 0 AND node_id NOT IN (SELECT node_id FROM node);
DB:  -> result: 0 (success)
DB: Checking for stale views not listed in feed list.
DB: DB cleanup finished. Continuing startup.
DB: executing SQL: CREATE TRIGGER item_insert INSERT ON items BEGIN UPDATE itemsets SET read = new.read WHERE item_id = new.ROWID; END;
DB:  -> result: 0 (success)
DB: executing SQL: CREATE TRIGGER item_update UPDATE ON items BEGIN UPDATE itemsets SET read = new.read WHERE item_id = new.ROWID; END;
DB:  -> result: 0 (success)
DB: executing SQL: CREATE TRIGGER item_removal DELETE ON itemsets BEGIN DELETE FROM items WHERE ROWID = old.item_id; DELETE FROM metadata WHERE item_id = old.item_id; END;
DB:  -> result: 0 (success)
DB: executing SQL: CREATE TRIGGER subscription_removal DELETE ON subscription BEGIN DELETE FROM node WHERE node_id = old.node_id; DELETE FROM update_state WHERE node_id = old.node_id; DELETE FROM subscription_metadata WHERE node_id = old.node_id; END;
DB:  -> result: 0 (success)
PERF: db_init took 0,749s
PERF: function "db_init" is slow! Took 749ms.
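
# (not part of the log) The schema objects that db_init probes above can be
# listed directly from the database file it opened, e.g. with the stock
# sqlite3 command-line shell:
sqlite3 /home/johan/.liferea_1.6/liferea.db \
  "SELECT type, name FROM sqlite_master ORDER BY type, name;"
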
PLUGINS: Scanning for plugins (/usr/lib/liferea):
PLUGINS: -> LUA Scripting Support Plugin (libliscrlua.so, type=3)
PLUGINS: -> libnotify notification (liblinotiflibnotify.so, type=2)
PLUGINS: using "libnotify" for notification type 0
PLUGINS: using "LUA Scripting Support Plugin" for scripting...
GUI: Unknown social bookmarking site ""!
GUI: Unknown link cosmos search site ""!
CACHE: Compiled without AVAHI support
GUI: Setting up menues
GUI: Setting up widget containers
GUI: Setting up status bar
GUI: Setting up tabbed browsing
GUI: Setting up feed list
PERF: ui_feedlist_init took 0,001s
GUI: Initialising menues
GUI: Setting up item view
GUI: using default date format
GUI: Loading icons
CACHE: Setting up root node
CACHE: Importing OPML file: /usr/share/liferea/opml/feedlist.opml
CACHE: -> must be a folder
GUI: adding node "Example Feeds" as child of parent="root"
GUI: folder empty check for node id "lbumisg"
GUI: folder empty check for node id "vxinkds"
CACHE: -> must be a folder
GUI: adding node "Open Source" as child of parent="Example Feeds"
GUI: folder empty check for node id "vxinkds"
GUI: folder empty check for node id "prglbtf"
CACHE: -> URL found assuming type feed
GUI: adding node "Planet Ubuntu" as child of parent="Open Source"
GUI: folder empty check for node id "prglbtf"
DB: loading subscription woalkrh update state (thread=0x204af20)
DB: Could not load update state for subscription woalkrh (error code 101)!
CACHE: import feed: title=Planet Ubuntu 
source=http://planet.ubuntu.com/rss20.xml typeStr=(null) interval=-1
CACHE: seems to be an import, setting new id: woalkrh and doing first 
download...
UPDATE: Scheduling Planet Ubuntu to be updated
DB: saving subscription woalkrh update state (thread=0x204af20)
DB: update state save took 0,075s
UPDATE: Resetting last poll counter to 1267916400.
DB: updating node info woalkrh (thread 0x204af20)
DB: subscription_update took 0,083s
PERF: import_parse_outline took 0,158s
CACHE: -> URL found assuming type feed
GUI: adding node "The Ubuntu Fridge" as child of parent="Open Source"
GUI: folder empty check for node id "prglbtf"
DB: loading subscription wwuloqb update state (thread=0x204af20)
DB: Could not load update state for subscription wwuloqb (error code 101)!
CACHE: import feed: title=The Ubuntu Fridge 
source=http://fridge.ubuntu.com/node/feed typeStr=(null) interval=-1
CACHE: seems to be an import, setting new id: wwuloqb and doing first 
download...
UPDATE: Scheduling The Ubuntu Fridge to be updated
DB: saving subscription wwuloqb update state (thread=0x204af20)
DB: update state save took 0,091s
UPDATE: Resetting last poll counter to 1267916400.
DB: updating node info wwuloqb (thread 0x204af20)
DB: subscription_update took 0,083s
PERF: import_parse_outline took 0,174s
CACHE: -> URL found assuming type feed
GUI: adding node "Debian Package a Day" as child of parent="Open Source"
GUI: folder empty check for node id "prglbtf"
DB: loading subscription yufujyn update state (thread=0x204af20)
DB: Could not load update state for subscription yufujyn (error code 101)!
CACHE: import feed: title=Debian Package a Day 
source=http://debaday.debian.net/feed/atom/ typeStr=(null) interval=-1
CACHE: seems to be an import, setting new id: yufujyn and doing first 
download...
UPDATE: Scheduling Debian Package a Day to be updated
DB: saving subscription yufujyn update state (thread=0x204af20)
DB: update state save took 0,074s
UPDATE: Resetting last poll counter to 1267916401.
DB: updating node info yufujyn (thread 0x204af20)
DB: subscription_update took 0,074s
PERF: import_parse_outline took 0,149s
CACHE: -> URL found assuming type feed
GUI: adding node "Gnomefiles" as child of parent="Open Source"
GUI: folder empty check for node id "prglbtf"
DB: loading subscription qbaqnuw update state (thread=0x204af20)
DB: Could not load update state for subscription qbaqnuw (error code 101)!
CACHE: import feed: title=Gnomefiles 
source=http://www.gnomefiles.org/files/gnomefiles.rdf typeStr=(null) interval=-1
CACHE: seems to be an import, setting new id: qbaqnuw and doing first 
download...
UPDATE: Scheduling Gnomefiles to be updated
DB: saving subscription qbaqnuw update state (thread=0x204af20)
DB: update state save took 0,081s
UPDATE: Resetting last poll counter to 1267916401.
DB: updating node info qbaqnuw (thread 0x204af20)
DB: subscription_update took 0,093s
PERF: import_parse_outline took 0,174s
CACHE: -> URL found assuming type feed
GUI: adding node "GNOME Footnotes" as child of parent="Open Source"
GUI: folder empty check for node id "prglbtf"
DB: loading subscription yrhihgg update state (thread=0x204af20)
DB: Could not load update state for subscription yrhihgg (error code 101)!
CACHE: import feed: title=GNOME Footnotes 
source=http://www.gnomedesktop.org/backend.php typeStr=(null) interval=-1
CACHE: seems to be an import, setting new id: yrhihgg and doing first 
download...
UPDATE: Scheduling GNOME Footnotes to be updated
DB: saving subscription yrhihgg update state (thread=0x204af20)
DB: update state save took 0,066s
UPDATE: Resetting last poll counter to 1267916401.
DB: updating node info yrhihgg (thread 0x204af20)
DB: subscription_update took 0,074s
PERF: import_parse_outline took 0,141s
CACHE: -> URL found assuming type feed
GUI: adding node "GrokLaw" as child of parent="Open Source"
GUI: folder empty check for node id "prglbtf"
DB: loading subscription muflfat update state (thread=0x204af20)
DB: Could not load update state for subscription muflfat (error code 101)!
CACHE: import feed: title=GrokLaw 
source=http://www.groklaw.net/backend/GrokLaw.rdf typeStr=(null) interval=-1
CACHE: seems to be an import, setting new id: muflfat and doing first 
download...
UPDATE: Scheduling GrokLaw to be updated
DB: saving subscription muflfat update state (thread=0x204af20)
DB: update state save took 0,080s
UPDATE: Resetting last poll counter to 1267916401.
DB: updating node info muflfat (thread 0x204af20)
DB: subscription_update took 0,074s
PERF: import_parse_outline took 0,156s
CACHE: -> URL found assuming type feed
GUI: adding node "Liferea Blog" as child of parent="Open Source"
GUI: folder empty check for node id "prglbtf"
DB: loading subscription uivftxr update state (thread=0x204af20)
DB: Could not load update state for subscription uivftxr (error code 101)!
CACHE: import feed: title=Liferea Blog 
source=http://liferea.blogspot.com/feeds/posts/default typeStr=(null) 
interval=-1
CACHE: seems to be an import, setting new id: uivftxr and doing first 
download...
UPDATE: Scheduling Liferea Blog to be updated
DB: saving subscription uivftxr update state (thread=0x204af20)
DB: update state save took 0,060s
UPDATE: Resetting last poll counter to 1267916401.
DB: updating node info uivftxr (thread 0x204af20)
DB: subscription_update took 0,075s
PERF: import_parse_outline took 0,135s
CACHE: -> URL found assuming type feed
GUI: adding node "mozillaZine" as child of parent="Open Source"
GUI: folder empty check for node id "prglbtf"
DB: loading subscription opwbveq update state (thread=0x204af20)
DB: Could not load update state for subscription opwbveq (error code 101)!
CACHE: import feed: title=mozillaZine 
source=http://www.mozillazine.org/atom.xml typeStr=(null) interval=-1
CACHE: seems to be an import, setting new id: opwbveq and doing first 
download...
UPDATE: Scheduling mozillaZine to be updated
DB: saving subscription opwbveq update state (thread=0x204af20)
DB: update state save took 0,080s
UPDATE: Resetting last poll counter to 1267916401.
DB: updating node info opwbveq (thread 0x204af20)
DB: subscription_update took 0,060s
PERF: import_parse_outline took 0,141s
CACHE: -> URL found assuming type feed
GUI: adding node "Planet GNOME" as child of parent="Open Source"
GUI: folder empty check for node id "prglbtf"
DB: loading subscription scbwpsa update state (thread=0x204af20)
DB: Could not load update state for subscription scbwpsa (error code 101)!
CACHE: import feed: title=Planet GNOME source=http://planet.gnome.org/atom.xml 
typeStr=(null) interval=-1
CACHE: seems to be an import, setting new id: scbwpsa and doing first 
download...
UPDATE: Scheduling Planet GNOME to be updated
DB: saving subscription scbwpsa update state (thread=0x204af20)
DB: update state save took 0,072s
UPDATE: Resetting last poll counter to 1267916401.
DB: updating node info scbwpsa (thread 0x204af20)
DB: subscription_update took 0,060s
PERF: import_parse_outline took 0,133s
CACHE: -> URL found assuming type feed
GUI: adding node "Slashdot" as child of parent="Open Source"
GUI: folder empty check for node id "prglbtf"
DB: loading subscription fqkylkw update state (thread=0x204af20)
DB: Could not load update state for subscription fqkylkw (error code 101)!
CACHE: import feed: title=Slashdot source=http://slashdot.org/slashdot.rss 
typeStr=(null) interval=-1
CACHE: seems to be an import, setting new id: fqkylkw and doing first 
download...
UPDATE: Scheduling Slashdot to be updated
DB: saving subscription fqkylkw update state (thread=0x204af20)
DB: update state save took 0,080s
UPDATE: Resetting last poll counter to 1267916402.
DB: updating node info fqkylkw (thread 0x204af20)
DB: subscription_update took 0,107s
PERF: import_parse_outline took 0,189s
CACHE: -> URL found assuming type feed
GUI: adding node "TuxMobil" as child of parent="Open Source"
GUI: folder empty check for node id "prglbtf"
DB: loading subscription vwbjbjl update state (thread=0x204af20)
DB: Could not load update state for subscription vwbjbjl (error code 101)!
CACHE: import feed: title=TuxMobil source=http://tuxmobil.org/tuxmobil_rss.rdf 
typeStr=(null) interval=-1
CACHE: seems to be an import, setting new id: vwbjbjl and doing first 
download...
UPDATE: Scheduling TuxMobil to be updated
DB: saving subscription vwbjbjl update state (thread=0x204af20)
DB: update state save took 0,076s
UPDATE: Resetting last poll counter to 1267916402.
DB: updating node info vwbjbjl (thread 0x204af20)
DB: subscription_update took 0,074s
PERF: import_parse_outline took 0,152s
CACHE: seems to be an import, setting new id: prglbtf and doing first 
download...
DB: updating node info prglbtf (thread 0x204af20)
DB: subscription_update took 0,074s
PERF: import_parse_outline took 0,227s
CACHE: -> must be a folder
GUI: adding node "Podcasts" as child of parent="Example Feeds"
GUI: folder empty check for node id "vxinkds"
GUI: folder empty check for node id "mdqygdw"
CACHE: -> URL found assuming type feed
GUI: adding node "EscapePod" as child of parent="Podcasts"
GUI: folder empty check for node id "mdqygdw"
DB: loading subscription jluteec update state (thread=0x204af20)
DB: Could not load update state for subscription jluteec (error code 101)!
CACHE: import feed: title=EscapePod source=http://escapepod.org/podcast.xml 
typeStr=(null) interval=-1
CACHE: seems to be an import, setting new id: jluteec and doing first 
download...
UPDATE: Scheduling EscapePod to be updated
DB: saving subscription jluteec update state (thread=0x204af20)
DB: update state save took 0,130s
UPDATE: Resetting last poll counter to 1267916402.
DB: updating node info jluteec (thread 0x204af20)
DB: subscription_update took 0,249s
PERF: import_parse_outline took 0,380s
PERF: function "import_parse_outline" is slow! Took 380ms.
CACHE: seems to be an import, setting new id: mdqygdw and doing first 
download...
DB: updating node info mdqygdw (thread 0x204af20)
DB: subscription_update took 0,074s
PERF: import_parse_outline took 0,455s
PERF: function "import_parse_outline" is slow! Took 455ms.
CACHE: seems to be an import, setting new id: vxinkds and doing first 
download...
DB: updating node info vxinkds (thread 0x204af20)
DB: subscription_update took 0,060s
PERF: import_parse_outline took 0,516s
PERF: function "import_parse_outline" is slow! Took 516ms.
GUI: adding node "Unread" as child of parent="root"
GUI: folder empty check for node id "lbumisg"
CACHE: import vfolder: title=Unread
CACHE: loading rule "unread" ""
DB: Checking for view lmlqbcf (SQL=CREATE VIEW view_lmlqbcf AS SELECT items.ROWID AS item_id, items.read AS item_read FROM items WHERE (items.read = 0) AND items.comment != 1)
DB: Dropping trigger failed (no such trigger: view_lmlqbcf_insert_before) SQL: DROP TRIGGER view_lmlqbcf_insert_before;
DB: Dropping trigger failed (no such trigger: view_lmlqbcf_insert_after) SQL: DROP TRIGGER view_lmlqbcf_insert_after;
DB: Dropping trigger failed (no such trigger: view_lmlqbcf_delete) SQL: DROP TRIGGER view_lmlqbcf_delete;
DB: Dropping trigger failed (no such trigger: view_lmlqbcf_update_before) SQL: DROP TRIGGER view_lmlqbcf_update_before;
DB: Dropping trigger failed (no such trigger: view_lmlqbcf_update_after) SQL: DROP TRIGGER view_lmlqbcf_update_after;
DB: executing SQL: REPLACE INTO view_state (node_id, unread, count) VALUES ('lmlqbcf', (SELECT count(*) FROM view_lmlqbcf WHERE item_read = 0), (SELECT count(*) FROM view_lmlqbcf));
DB:  -> result: 0 (success)
CACHE: seems to be an import, setting new id: lmlqbcf and doing first 
download...
DB: updating node info lmlqbcf (thread 0x204af20)
DB: subscription_update took 0,093s
PERF: import_parse_outline took 0,705s
PERF: function "import_parse_outline" is slow! Took 705ms.
GUI: adding node "Important" as child of parent="root"
GUI: folder empty check for node id "lbumisg"
CACHE: import vfolder: title=Important
CACHE: loading rule "flagged" ""
DB: Checking for view aktotxr (SQL=CREATE VIEW view_aktotxr AS SELECT items.ROWID AS item_id, items.read AS item_read FROM items WHERE (items.marked = 1) AND items.comment != 1)
DB: Dropping trigger failed (no such trigger: view_aktotxr_insert_before) SQL: DROP TRIGGER view_aktotxr_insert_before;
DB: Dropping trigger failed (no such trigger: view_aktotxr_insert_after) SQL: DROP TRIGGER view_aktotxr_insert_after;
DB: Dropping trigger failed (no such trigger: view_aktotxr_delete) SQL: DROP TRIGGER view_aktotxr_delete;
DB: Dropping trigger failed (no such trigger: view_aktotxr_update_before) SQL: DROP TRIGGER view_aktotxr_update_before;
DB: Dropping trigger failed (no such trigger: view_aktotxr_update_after) SQL: DROP TRIGGER view_aktotxr_update_after;
DB: executing SQL: REPLACE INTO view_state (node_id, unread, count) VALUES ('aktotxr', (SELECT count(*) FROM view_aktotxr WHERE item_read = 0), (SELECT count(*) FROM view_aktotxr));
DB:  -> result: 0 (success)
CACHE: seems to be an import, setting new id: aktotxr and doing first 
download...
DB: updating node info aktotxr (thread 0x204af20)
DB: subscription_update took 0,076s
PERF: import_parse_outline took 0,716s
PERF: function "import_parse_outline" is slow! Took 716ms.
PERF: default_source_source_import took 3,756s
PERF: function "default_source_source_import" is slow! Took 3756ms.
PERF: node_source_setup_root took 3,756s
PERF: function "node_source_setup_root" is slow! Took 3756ms.
CACHE: Initializing node state
GUI: Notification setup
UPDATE: Performing initial feed update
UPDATE: initial update: using auto update
CONF: Scheduling feedlist save
PERF: feedlist_init took 3,759s
PERF: function "feedlist_init" is slow! Took 3759ms.
GUI: Setting toolbar style
GUI: Loading accelerators
GUI: Restoring window position
GUI: Retrieved saved setting: size 1559x825 position 72:41
GUI: Restoring to size 1559x825 position 72:41
GUI: Loading pane proportions
GUI: Creating HTML widget
set zoom: 1.21
CONF: Setting /apps/liferea/last-zoomlevel to 121
GUI: Setting item list layout mode: 2
HTML: theme color "GTK-COLOR-FG" is 323232
HTML: theme color "GTK-COLOR-BG" is C1BDB5
HTML: theme color "GTK-COLOR-LIGHT" is F5F4F2
HTML: theme color "GTK-COLOR-DARK" is 8B857B
HTML: theme color "GTK-COLOR-MID" is C0BDB7
HTML: theme color "GTK-COLOR-BASE" is FFFFFF
HTML: theme color "GTK-COLOR-TEXT" is 323232
PERF: liferea_shell_create took 4,197s
PERF: function "liferea_shell_create" is slow! Took 4197ms.
GUI: Session Management: ICE initialized.
GUI: Session Management: Connecting with no previous ID
GUI: Session Management: Handling new ICE connection... 
GUI: Session Management: done.
GUI: Session Management: Connected to manager (gnome-session) with client ID 
105c168e81a203a226126791640483624300000075740294
GUI: Session Management: Using /usr/bin/liferea as command
DB: startup took 4,969s
PERF: function "startup" is slow! Took 4969ms.
UPDATE: processing request (http://planet.ubuntu.com/rss20.xml)
NET: downloading http://planet.ubuntu.com/rss20.xml
GUI: Session Management: Received first save_yourself

Liferea did receive signal 11 (Segmentation fault).
You have propably triggered a program bug. I will now try to 
create a backtrace which you can attach to any support requests.

GUI: Session Management: Received save_complete
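
No backtrace appears above even though the crash handler says it will try to create one. Not part of the log, but if it helps I can try to reproduce the crash under gdb, roughly like this (the liferea-dbg debug-symbol package name is an assumption and may differ on this release):

# run Liferea under gdb so the segfault is caught with symbols available
sudo apt-get install liferea-dbg   # assumed name of the debug-symbol package
gdb --args liferea --debug-all
# then inside gdb:
#   (gdb) run
#   ... wait for the SIGSEGV ...
#   (gdb) thread apply all bt full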

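Also not from the log: some read-only checks against the database file it opens, in case the on-disk data plays a role (only a guess on my part). The two counts mirror the cleanup statements in the log; sqlite3 is the standard SQLite command-line shell.

sqlite3 /home/johan/.liferea_1.6/liferea.db "PRAGMA integrity_check;"
sqlite3 /home/johan/.liferea_1.6/liferea.db \
  "SELECT count(*) FROM items    WHERE ROWID   NOT IN (SELECT item_id FROM itemsets);
   SELECT count(*) FROM itemsets WHERE item_id NOT IN (SELECT ROWID   FROM items);"
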
-- 
Liferea crashes at start
https://bugs.launchpad.net/bugs/531464
