Might I ask once more whether there is a workaround for capturing
real-time or, failing that, close-of-play quotes from Yahoo?

Thanks,

Mike

Sent from my iPad

> On 26 Nov 2019, at 22:11, Gilles Kirouac <g1...@myriade.ca> wrote:
> 
> Thomas
>   To avoid the dependency on Linux, without changing your syntax, I
> redefined epochtime as:
> 
> NB. create a time stamp [of type character]
> NB.    epochtime '1/1/2019'
> NB. 1546318800
> epochtime =: 3 : 0
> NB. 62091 = todayno 1970 1 1 (the Unix epoch); 18000 = 5*3600 offset
> ":18000+86400*62091-~todayno _1|.".;._1 '/',y NB. normalized yr mn dy
> )
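> 
>   For anyone tracing the phrase, it evaluates right to left; a sketch
> with the intermediate values for '1/1/2019':
> 
>    ".;._1 '/','1/1/2019'  NB. parse m/d/y into numbers
> 1 1 2019
>    _1 |. 1 1 2019         NB. rotate to y m d
> 2019 1 1
>    todayno 2019 1 1
> 79988
>    18000 + 86400 * 79988 - 62091
> 1546318800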
> 
>   and I removed the two curtails, which I believe were only there to
> drop the trailing linefeed from the shell's date output (": emits
> none). Using a fixed-width font:
> 
> qstr =. '?period1=',(}:epochtime d1),'&period2=',(}:epochtime d2),'&interval=1d&events=history&crumb=',crumb
> .....................^^...........................^^
> 
>   todayno is in the standard library.
> 
>   Thanks!
> 
>   ~ Gilles
>> On 2019-11-24 at 23:19, Thomas McGuire wrote:
>> OK, so here is my final code, cleaned up and now working after fixing
>> the double-quote issue (see the second-to-last line of code):
>> 
>> NB. Navigating yahoo.com to programmatically get historical stock prices
>> NB.
>> require 'web/gethttp'
>> require 'regex'
>> 
>> NB. use the system date command (BSD/macOS -j -f flags) to create a Unix time stamp
>> epochtime =: 3 : 0
>> 2!:0 'date -jf ''%m/%d/%Y %H:%M:%S %p'' ''',y,' 05:00:00 PM'' ''+%s'''
>> )
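>> 
>> NB. on GNU/Linux, where date lacks the -j/-f form above, a rough
>> NB. equivalent (a sketch, assuming GNU date's -d parser accepts the
>> NB. same m/d/Y dates) would be:
>> NB.    epochtime =: 3 : 0
>> NB.    2!:0 'date -d ''',y,' 17:00:00'' ''+%s'''
>> NB.    )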
>> 
>> NB. precision functions
>> ppq =: 9 !: 10 NB. print-precision query
>> pps =: 9 !: 11 NB. print-precision set
>> NB. I set the precision to 16 to ensure full printing of the Unix timestamps
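>> pps 16 NB. set it now so the timestamps print in full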
>> 
>> NB. Conversion of \u00xx escape sequences
>> HEX=:16#.'0123456789abcdef'i.]
>> 
>> xutf =: 3 : 0
>> u: HEX tolower 2 }. y
>> )
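>> 
>> NB. e.g. an escaped slash decodes back to a '/' character:
>> NB.    xutf '\u002F'
>> NB. /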
>> 
>> crumbstr =: '"CrumbStore":{"crumb":"'
>> NB. the crumb is on the page that links to the historical-data
>> NB. download. If you fetch the correct first page, you only need to
>> NB. search for the crumbstr above; there will be only one match.
>> getcrumb =: 3 : 0
>> NB. find the start index and end index of the crumb
>> sidx =. (#crumbstr)+({: I. crumbstr E. y) NB. index just past crumbstr
>> sstr =. (sidx + i. 30){y NB. next 30 characters contain the crumb
>> eidx =. {. I. '"' E. sstr NB. the crumb ends at the first double quote
>> 
>> NB. using rxapply, convert all \u00xx unicode escape sequences
>> crumb =. '(\\u[0-9a-fA-F][0-9a-fA-F][0-9a-fA-F][0-9a-fA-F])' xutf rxapply (i.eidx){sstr
>> )
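>> 
>> NB. as used here, rxapply applies the verb on its left to each match
>> NB. of the pattern, leaving the rest of the string untouched, e.g.:
>> NB.    '(\\u[0-9a-fA-F][0-9a-fA-F][0-9a-fA-F][0-9a-fA-F])' xutf rxapply 'a\u002Fb'
>> NB. a/b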
>> 
>> financeURL =: 'https://finance.yahoo.com/quote/' NB. e.g. AAPL/history?p=AAPL
>> histURL =: 'https://query1.finance.yahoo.com/v7/finance/download/'
>> NB. the histURL needs a ticker symbol followed by:
>> NB. ?period1=<unixts for p1>&period2=<unixts for p2>&interval=1d&events=history&crumb=<crumbval>
>> NB.
>> NB. here is a full-fledged quote request from the website itself for Apple Computer:
>> NB. https://query1.finance.yahoo.com/v7/finance/download/AAPL?period1=1543024670&period2=1574560670&interval=1d&events=history&crumb=jZO816Y7CSK
>> 
>> gethistorical=: 3 : 0
>> 'symbol d1 d2' =. y
>> 
>> NB. create the URL of the start page that carries the crumb needed for
>> NB. the historical download
>> NB. a bash implementation uses the following format:
>> NB. sURL =. financeURL,symbol,'/?p=',symbol
>> 
>> NB. But the link to the download of historical prices is:
>> sURL =. financeURL,symbol,'/history?p=',symbol
>> 
>> NB. get the page with gethttp; -s and -c cookie.txt are curl options
>> NB. (-c saves the session cookies the download request will need)
>> res =. '-s -c cookie.txt' gethttp sURL
>> 
>> crumb =. getcrumb res
>> 
>> qstr =. '?period1=',(}:epochtime d1),'&period2=',(}:epochtime d2),'&interval=1d&events=history&crumb=',crumb
>> URL=. histURL,symbol,qstr
>> 
>> NB. it turns out that to get a file download you need to double-quote
>> NB. the URL; J has a built-in verb for that (dquote)
>> res2 =. '-s -b cookie.txt ' gethttp dquote URL
>> res2
>> )
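>> 
>> NB. hypothetical usage (the ticker and date range are just examples);
>> NB. the result is the CSV text Yahoo returns:
>> NB.    csv =. gethistorical 'AAPL';'11/1/2019';'11/22/2019'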
>> 
----------------------------------------------------------------------
For information about J forums see http://www.jsoftware.com/forums.htm
