You can also save cookies to a file named cookies.txt as follows:

# curl -c cookies.txt https://login.yahoo.com -I
and then run the following to display the 11-character crumb, using the cookies saved by the previous command:

# curl -H "user-agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.95 Safari/537.3" -b cookies.txt https://query2.finance.yahoo.com/v1/test/getcrumb

Output of a working F::Q example with DEBUG=1 set should be similar to the one below ... note the A1, A1S, A3, and AS cookies listed and the corresponding crumb value at the end ...

### [Wed Apr 17 14:31:18 2024] cookie_jar : bless( {
###   COOKIES => {
###     '.yahoo.com' => {
###       '/' => {
###         A1 => [
###           0,
###           'd=AQABBHYVIGYCENNwdHR1aVcaemtvqQ4t4WcFEgEBAQFmIWYpZtxH0iMA_eMAAA&S=AQAAApUVRCc9YJK2Hw8hmnctLGc',
###           undef,
###           1,
###           1,
###           1744936278,
###           undef,
###           {
###             HttpOnly => undef,
###             SameSite => 'Lax'
###           }
###         ],
###         A1S => [
###           0,
###           'd=AQABBHYVIGYCENNwdHR1aVcaemtvqQ4t4WcFEgEBAQFmIWYpZtxH0iMA_eMAAA&S=AQAAApUVRCc9YJK2Hw8hmnctLGc',
###           undef,
###           1,
###           1,
###           undef,
###           1,
###           {
###             SameSite => 'Lax'
###           }
###         ],
###         A3 => [
###           0,
###           'd=AQABBHYVIGYCENNwdHR1aVcaemtvqQ4t4WcFEgEBAQFmIWYpZtxH0iMA_eMAAA&S=AQAAApUVRCc9YJK2Hw8hmnctLGc',
###           undef,
###           1,
###           1,
###           1744936278,
###           undef,
###           {
###             HttpOnly => undef,
###             SameSite => 'None'
###           }
###         ]
###       }
###     },
###     'login.yahoo.com' => {
###       '/' => {
###         AS => [
###           0,
###           'v=1&s=ZpPpXAWp&d=A662166f6|wy8UEd7.2SrgMZFNCyRJFuJ88rkOfc7f2KNA89u_WuZ6fXrY8xzJ07zjmdHjjnVL7tUTSw9uOSzz31wvEds3TZq7h7j7PYLUObzbyEjzTF2bq9g.9kTJ2emMatlgJP1V5BfRD7rzwA6LUYRRf3NJ2o4nE8UA9THS9pfojIhzfHE.zV82j.SFjuZaLiyo1qU7JOYKwW5rSnnrKLYDx_5LOF3xDsgPPsgB97CPalhkcGNN0Sfewon4rGTiq1xRAFC_P7moFihb9KK5ejN6TlswyksxWcImt6DpMaWxzQImS9rS4v_iM_IcHfyWSt4k6CankWWAtD0dCdmEuq5BJ8SRDpDwwjyWAPV8arKd8r9JVaDS1ODV.13EknVzp3Wi8WTpRiQYTeYhzBy0IpwYgEdFnEmFBkOXRfykYsYcQ60Dg1rUQZhWg8SJVk.ch9FNELrRVHCBaHJ6CK5d2BZPWKpFLz7NHK0POU5_rrRYjLhveOX7ISsG6GJxBSNCFuff4Aas0BbFL3mVc2X8hCzveGWCtO5TuXTkPrzJtHAVVzgQFrjQVsAVKycNM6I0lfvHNb23em30BKc7Kvc86pUxfXRBZi68mBBsCrm2isD6O9tKLgl7FovOZX4YtQAX2rNY6n.4Gen1pY5gTRCfU5iL.4gCjjmGt6Vvgtjlsvo0tXBSTUPmZvGwx97JON.atz7LDy9O2qALAskB04PnUhUXaH5thOOHFPYF92AB2w7f4j3IZMwHD.UFuWZc7EsaRG_5989fHZ9FtPyH94Ut6vqlInUkL_xXQuwmODh2RyYuCAoJCTwD~A',
###           undef,
###           1,
###           1,
###           undef,
###           1,
###           {
###             HttpOnly => undef
###           }
###         ]
###       }
###     }
###   }
### }, 'HTTP::Cookies' )
### [Wed Apr 17 14:31:18 2024] crumb : '8CJLZDzzDDN'

-----Original Message-----
From: Bruce Schuck <bsch...@asgard-systems.com>
Sent: Wednesday, April 17, 2024 1:50 PM
To: GnuCash User <gnucash-user@gnucash.org>
Cc: kalpesh.pa...@usa.net; rull...@protonmail.com; adam.grif...@gmail.com
Subject: Re: [GNC] Finance-Quote 1.60 released!

Wed Apr 17 11:58:37 EDT 2024, Kalpesh Patel writes:

> Is this happening on Windows 10 with Strawberry perl?
> There is one other user having a problem getting secure cookies on
> Windows 10, which is what this appears to be as well. It is related
> to some Perl SSL package being broken or not installed correctly,
> but I haven't been able to pinpoint it.

Wednesday, April 17, 2024 5:44 AM, Richard Ullger writes:

> yahoojson and yahooweb are not working for me either in v1.60 and
> they didn't work in 1.59_01.

WRT "yahooweb", Yahoo made changes to their finance pages.
The URL used for the web scrape, aside from some formatting changes that break the module, also returns a huge amount of header information (over 16k bytes when I fetch it using curl). While I have a fix for the oversized headers, work still needs to be done on the parsing to retrieve the data (see https://github.com/finance-quote/finance-quote/issues/377).

WRT "yahoo_json" (which can now also be called as "yahoojson"), some users are reporting the same. Turning debug on within Perl has so far indicated that not all the required cookies are being returned. From using "curl", the following cookies should be retrieved:

AS=v=...
A1=d=...
A3=d=...
A1S=d=...

From one user's output, it looks like only the "AS=v=..." cookie is returned.

Those having the same issue as Richard, please execute "curl --head https://login.yahoo.com" or "curl --include https://login.yahoo.com" and look for the "set-cookie:" lines in the header.

Here is a way to test outside of GnuCash on Linux and Mac. It uses the cached package source left behind by an install via "cpan".

# lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description:    Ubuntu 22.04.4 LTS
Release:        22.04
Codename:       jammy

# If you have installed 1.60 more than once (perhaps with different
# library paths), 1.60-0 may be 1.60-1 or 1.60-2, or ...
cd $HOME/.cpan/build/Finance-Quote-1.60-0
export PERL5LIB=$HOME/.cpan/build/Finance-Quote-1.60-0/lib:$PERL5LIB
./Examples/stockdump.pl yahoo_json AAPL

$VAR1 = {
          'AAPLlow' => '168.27',
          'AAPLopen' => '171.71',
          'AAPLsymbol' => 'AAPL',
          'AAPLeps' => '6.42',
          'AAPLvolume' => 58572152,
          'AAPLisodate' => '2024-04-16',
          'AAPLname' => 'AAPL (Apple Inc.)',
          'AAPLdiv_yield' => '0.5501187',
          'AAPLcurrency' => 'USD',
          'AAPLdate' => '04/16/2024',
          'AAPLhigh' => '173.755',
          'AAPLtype' => 'EQUITY',
          'AAPLpe' => '26.457943',
          'AAPLlast' => '169.86',
          'AAPLyear_range' => ' 162.8 - 199.62',
          'AAPLexchange' => 'NasdaqGS',
          'AAPLsuccess' => 1,
          'AAPLclose' => '172.69',
          'AAPLmethod' => 'yahoo_json'
        };

You can output debugging information by adding "DEBUG=1" before executing the stockdump.pl script:

$ DEBUG=1 ./Examples/stockdump.pl yahoo_json AAPL

For those still having an issue with v1.60, my guess is that you will find only the "AS=" cookie in the debug output. As Kalpesh theorized, the issue could lie within another Perl module/package.

On Wed, 17 Apr 2024 08:23:38 +0200, Adam Griffis writes:

> However, YahooJSON is still giving me the same error. Is an API key
> required for YahooJSON now?

No, Yahoo is not requiring an API key.

Bruce S

_______________________________________________
gnucash-user mailing list
gnucash-user@gnucash.org
To update your subscription preferences or to unsubscribe:
https://lists.gnucash.org/mailman/listinfo/gnucash-user
-----
Please remember to CC this list on all your replies.
You can do this by using Reply-To-List or Reply-All.
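[Editor's aside] The "look for the set-cookie: lines" check suggested in the thread can be automated with standard sed/grep. This is a sketch only: the header text below is a made-up stand-in for the real output of "curl --head https://login.yahoo.com" (actual cookie values are long opaque strings), and it simply verifies that all four cookie names the thread says yahoo_json needs are present.

```shell
# Stand-in for `curl --head https://login.yahoo.com` output
# (hypothetical shortened values; real ones are long opaque strings).
headers='set-cookie: A1=d=EXAMPLE; Path=/; Domain=.yahoo.com; HttpOnly; SameSite=Lax
set-cookie: A3=d=EXAMPLE; Path=/; Domain=.yahoo.com; HttpOnly; SameSite=None
set-cookie: A1S=d=EXAMPLE; Path=/; Domain=.yahoo.com; SameSite=Lax
set-cookie: AS=v=1&s=EXAMPLE; Path=/; HttpOnly'

# Extract just the cookie name from each set-cookie line
# (everything before the first '=').
names=$(printf '%s\n' "$headers" | sed -n 's/^set-cookie: \([^=]*\)=.*/\1/p')

# Per the thread, all four of these should be set; report any that are not.
for want in A1 A1S A3 AS; do
    if printf '%s\n' "$names" | grep -qx "$want"; then
        echo "found   $want"
    else
        echo "MISSING $want"
    fi
done
```

Replace the here-string with a real pipe from curl (`curl -sI https://login.yahoo.com`) to test a live response; on an affected system you would expect "MISSING" for A1, A1S, and A3, matching the reports above.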