Chunk the file in a shell script, add the not-previously-seen URL
lines to Pocket with their API
(http://getpocket.com/developer/docs/v3/add) and a copy of cURL, then
subscribe to that Pocket feed with whatever tools you already use?
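
Untested sketch of that; CONSUMER_KEY and ACCESS_TOKEN are
placeholders you'd get by registering per the API docs above, and
"pocket-add.sh" is just a name I made up:

  #!/bin/sh
  # pocket-add.sh: submit each not-yet-seen URL in a file to Pocket.
  CONSUMER_KEY="your-consumer-key"    # placeholder
  ACCESS_TOKEN="your-access-token"    # placeholder
  SEEN="$HOME/.pocket-seen"           # URLs we've already submitted
  touch "$SEEN"
  while read -r url; do
      grep -qxF "$url" "$SEEN" && continue   # skip already-seen lines
      curl -s https://getpocket.com/v3/add \
           -H 'Content-Type: application/json; charset=UTF-8' \
           -H 'X-Accept: application/json' \
           -d "{\"url\":\"$url\",\"consumer_key\":\"$CONSUMER_KEY\",\"access_token\":\"$ACCESS_TOKEN\"}"
      echo "$url" >> "$SEEN"
  done < "${1:?usage: pocket-add.sh url-list}"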

You could even just preprocess them into files dated in YY-MM-DD
format, then have a daily cron job that checks "is there a file for
today's date?" and, if so, makes the API calls to Pocket.
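
Something like this (untested; assumes GNU date for the "+N days"
math, ~/drip as the directory of dated chunks, and the hypothetical
pocket-add.sh from the sketch above):

  # split the big list into ~10-URL files named for the day each
  # should go out, starting tomorrow
  mkdir -p ~/drip && cd ~/drip
  split -l 10 /path/to/all-urls.txt chunk.
  i=1
  for f in chunk.*; do
      mv "$f" "$(date -d "+$i days" +%y-%m-%d)"
      i=$((i + 1))
  done

  # crontab entry: each morning, feed today's file (if any) to the
  # pocket-add.sh sketch above; note % must be escaped in crontab
  0 7 * * * f="$HOME/drip/$(date +\%y-\%m-\%d)"; [ -f "$f" ] && pocket-add.sh "$f"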

I think the RSS here isn't the hard part of the problem; it's knowing
which tools you want to select to do the processing.  :)   And
avoiding being 'fancy'...

--e



On Fri, Aug 7, 2015 at 7:05 AM, Craig Constantine
<[email protected]> wrote:
> I have tools/setup for following RSS feeds that already works well for me.
>
> It's the "turn a list of thousands of URLs" into an RSS feed that I'm trying 
> to figure out. Without writing something myself, the only solution I've 
> thought of is to use a WordPress site and a plugin that will bulk create 
> posts, scheduled in the future, from a CSV file. So I'd write a script that 
> added a post-on date column to the URLs and then load them... but that's 
> crazy clunky to run an entire WP install just to get an RSS feed...
>
> -- Craig Constantine, http://constantine.name
>
>
> On Aug 6, 2015, at 8:04 PM, Ed <[email protected]> wrote:
>
> create your own RSS feed of each page (control each with cron?) and
> feed them into PlanetPlanet?  http://www.planetplanet.org/
>
> On Thu, Aug 6, 2015 at 1:24 PM, Craig Constantine
> <[email protected]> wrote:
>> Anyone know of a way to turn a "large" list of URLs into an RSS feed?
>>
>> No, I'd prefer not to bulk load the URLs into (for example) my own web site 
>> as some sort of future-scheduled posts. (and then follow my own RSS feed.) 
>> I'm thinking of something like Readability or Instapaper that would let me 
>> bulk load them and then feed them out as a plain-jane RSS feed.
>>
>> Yes, I could totally write it from scratch. But I'm looking for a way to not 
>> have to reinvent the wheel.
>>
>> Use case?
>>
>> I want to read all of a large web site (think ~1,000 pages.) I only need to 
>> glance at the page to know if I care enough to really read it. So I thought 
>> of an RSS feed that "drips" one URL every specified number of hours.
>>
>> -- Craig Constantine, http://constantine.name
>>
_______________________________________________
Discuss mailing list
[email protected]
https://lists.lopsa.org/cgi-bin/mailman/listinfo/discuss
This list provided by the League of Professional System Administrators
http://lopsa.org/
