2 sites that don't!

2016-01-26 Thread Richard Torrens (lists)
http://www.catbehaviourist.com/sure-feed-microchip-pet-feeder-review/

http://www.zooplus.co.uk/shop/cats/cat_bowls_feeders/feeders/programmable/479534?gclid=CImfsqu3x8oCFUORGwodHCUFYQ

Both display very badly. First site all text is hidden. The source is very
over-complicated, so I can't see why.

-- 
Richard Torrens.
http://www.Torrens.org.uk for genealogy, natural history, wild food, walks, cats
and more!



Re: Big push on testing needed

2016-01-26 Thread lists
In article <3bc72c4755.davem...@my.inbox.com>,
   Dave Higton  wrote:
> Please, everybody, download the latest test build (which will,
> of course, change as bugs are found and fixed), give it a good
> thrashing, and get your bug reports in.

A site I use a lot is

http://cpc.farnell.com/

Although it displays quite nicely, the "search" box, which should be within
the broad blue banner at the top, to the right of "All products", isn't
visible - is this fixable?

#3312

-- 
Stuart Winsor

Tools With A Mission
sending tools across the world
http://www.twam.co.uk/



Re: Latest builds missing.

2016-01-26 Thread Peter Young
On 26 Jan 2016  David Pitt  wrote:

> Currently the Changes page shows #3312 as the latest successful build but
> the Downloads page only goes up to #3307 for all builds.

> http://ci.netsurf-browser.org/jenkins/job/netsurf/changes

> http://ci.netsurf-browser.org/builds/riscos/

I've just downloaded #3312 using Frank's version of Fetch_NS.

Best wishes,

Peter.

-- 
Peter Young (zfc Os) and family
Prestbury, Cheltenham, Glos. GL52, England
http://pnyoung.orpheusweb.co.uk
pnyo...@ormail.co.uk



Re: Latest builds missing.

2016-01-26 Thread Michael Drake



On 26/01/16 07:49, David Pitt wrote:

Currently the Changes page shows #3312 as the latest successful build but
the Downloads page only goes up to #3307 for all builds.


The downloads are still there, but "json" has been removed from the
filenames.  http://ci.netsurf-browser.org/builds/riscos/?C=M;O=D

--
Michael Drake  http://www.netsurf-browser.org/



Re: Latest builds missing.

2016-01-26 Thread David Pitt
Michael Drake, on 26 Jan, wrote:

> 
> 
> On 26/01/16 07:49, David Pitt wrote:
> > Currently the Changes page shows #3312 as the latest successful build
> > but the Downloads page only goes up to #3307 for all builds.
> 
> The downloads are still there, but "json" has been removed from the
> filenames.  http://ci.netsurf-browser.org/builds/riscos/?C=M;O=D

So they are, just at the top of the page now.

Thanks.
-- 
David Pitt



Re: 2 sites that don't!

2016-01-26 Thread Richard Torrens (lists)
In article ,
   Grahame Parish  wrote:
> Not related to the problem with the sites, but I confirm that the 
> feeders work very well.  We have several cats with different dietary  
> requirements.

Thanks - but any other discussion, off-list I think!

-- 
Richard Torrens.
http://www.Torrens.org.uk for genealogy, natural history, wild food, walks, cats
and more!



Re: 2 sites that don't!

2016-01-26 Thread Richard Torrens (lists)
In article <145dff4755.ga...@wra1th.plus.com>,
   Gavin Wraith  wrote:
> I have a useful little Lua script, called noscript, for use with
> StrongED that strips out all the stuff between matching <script>
> and </script> tags. Here it is:

Useful: I have on occasion looked for the tags and deleted everything
between them - it can be done with a StrongED search and replace. But the
script is easier.

-- 
Richard Torrens.
http://www.Torrens.org.uk for genealogy, natural history, wild food, walks, cats
and more!



Re: 2 sites that don't!

2016-01-26 Thread Jim Nagel
Gavin Wraith  wrote on 26 Jan:

> Using it on the text of the first webpage produces source for a leaner
> webpage that reveals the hidden text. Here is the procedure:
> 1. Page->View Source in NetSurf
> 2. Save to some scrap directory.
> 3. Shift-click to load into StrongED
> 4. Shift-Drag noscript to apply icon.
> 5. Save the result and change its type to HTML.

Did that on the first web page.  Got a security warning from StrongED 
about "process"; I said "allow always".
Then "File name '.in' not recognised".

Maybe my version of Lua is too old?  !BOOT.Resources.!Lua.version says 
5.70 (2013-08-16).

Potentially sounds like a very welcome utility.

Have often wished for a similar thing to override bothersome CSS, 
particularly CSS that specifies tiny grey text (which seems to be a 
current fashion among some designers with younger eyes than mine).

-- 
Jim Nagel    www.archivemag.co.uk



Filtering webpages

2016-01-26 Thread Gavin Wraith
It was some years ago that Fred Graute added scriptability
to StrongED, using the 'apply' icon. It enables one to
alter what is displayed in a StrongED window by dragging
a script to that icon.

There are often occasions when one would like to massage
webpages displayed in a browser. At present one has
to save out the displayed page's textual source to
StrongED, drag in the script to do the massaging,
save the result as an HTML file and then display that
result in the browser. It would be nice if this process
could be simplified and done entirely within the browser,
without having to-and-fro between browser and StrongED.
A typically useful script is one that removes all the source starting
with a <script> tag and ending with the matching </script> tag.

Re: 2 sites that don't!

2016-01-26 Thread cj
In article <09d3064855@abbeypress.net>,
   Jim Nagel  wrote:
> Have often wished for a similar thing to override bothersome CSS, 
> particularly CSS that specifies tiny grey text (which seems to be a 
> current fashion among some designers with younger eyes than mine).

... and what about the latest issue of Archive where, on page 3, you
have white text on a light grey background???

-- 
Chris Johnson



Re: Filtering webpages

2016-01-26 Thread Richard Torrens (lists)
In article ,
   Gavin Wraith  wrote:

> So many web pages these days are crammed with stuff put in by
> advertisers or third parties which upset the viewing experience,
> or which NetSurf is unable to render properly. The individual
> user is increasingly going to need tools to fight back, with
> which they can emasculate the page of all the clutter added in
> for statistical purposes, to add unwanted advertisements and
> pop-ups. The days of innocence are long over, so filtering of
> web pages is now a necessity.

NS has choices: 
Hide advertisements
Disable pop-up windows
Disable JavaScript

These, I would guess, tell it to ignore such content. It is then filtering
exactly as you seem to want!
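
For what it's worth, these correspond to simple name:value lines in
NetSurf's Choices file. The spelling of the JavaScript one appears
elsewhere in this thread; the other two names below are only my guesses
from memory, so check your own Choices file:

  enable_javascript:0
  block_advertisements:1
  block_popups:1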

So I guess you are asking for a throwback icon, to throw the source out to
an external editor, where it can be processed and returned to NS?

This is perfectly possible within RISC OS. But I doubt that it is possible
in other OSes.

I hope the developers will contradict me and say it's totally possible -
but I have doubts!

-- 
Richard Torrens.
http://www.Torrens.org.uk for genealogy, natural history, wild food, walks, cats
and more!



Re: 2 sites that don't!

2016-01-26 Thread Gavin Wraith
In message <09d3064855@abbeypress.net>
  Jim Nagel  wrote:

>Did that on the first web page.  Got a security warning from StrongEd
>about "process"; I said "allow always".
>Then "File name '.in' not recognised".

Sounds like !StrongED$ScrapDir has not been set. See the line

Set StrongED$Script_Outfile <StrongED$ScrapDir>.in

in !StrongED.Defaults.Tools.!ScriptSED.Tools.Despatch.
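
If the variable really has not been set, defining it by hand and trying
again should cure the '.in' error - something like the line below, where
the directory chosen is only a guess (point it at whatever scrap
directory you prefer):

  Set StrongED$ScrapDir <Wimp$ScrapDir>.StrongED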

--
Gavin Wraith (ga...@wra1th.plus.com)
Home page: http://www.wra1th.plus.com/



Re: Filtering webpages

2016-01-26 Thread Gavin Wraith
In message <55480e2bfeli...@torrens.org.uk>
  "Richard Torrens (lists)"  wrote:


>NS has choices:
>Hide advertisements
>Disable pop-up windows
>Disable JavaScript
>
>These, I would guess, tell it to ignore such content. It is then filtering
>exactly as you seem to want!

These are desirable but not general enough for my purposes.

>So I guess you are asking for a throwback icon, to throw the source out to
>an external editor, where it can be processed and returned back to NS?

Exactly.

I seem to remember using a browser on an Acorn machine, many years ago, that
gave you the option of not displaying images unless you specifically clicked
on the icon that the browser used to indicate a missing image.

I would also like to be able to discriminate content by source URL and
to give permissions for which should be blocked or which allowed
through. But I suspect that this requires filtering at the packet level.
My feeling is that the majority of the public have little idea about
what is going on when they use the internet, as opposed to the
businesses which would like to exploit their ignorance. Web savvy
programmers produce the websites and also the browsers. I would like to
see users take back, or be given back (by appropriate tools) more
control over this predator/prey scenario. It is a question of freedom.

--
Gavin Wraith (ga...@wra1th.plus.com)
Home page: http://www.wra1th.plus.com/



Archive 23:12 white-on-grey

2016-01-26 Thread Jim Nagel
cj  wrote on 26 Jan:
> ... and what about the latest issue of Archive where, on page 3, you
> have white text on a light grey background???

Yes; that was bad.  The contrast looked fine in the past, and looked 
fine this time too on screen, but unfortunately my printshop used a 
new printer.  Before next issue I'll ask him to put through a test 
sheet with samples of the whole range of tints.

-- 
Jim Nagel    www.archivemag.co.uk



Re: NetSurf progress

2016-01-26 Thread lists
In article <41035cf02c7.007bd...@davehigton.me.uk>,
   Dave Higton  wrote:
> NetSurf is fully capable of creating bug reports using the Mantis
> issue tracker, including attaching files and adding notes.  If
> you have a problem, ask on this list for help.

This morning I went through the process of signing up to the bug-tracker;
I'm still awaiting the confirmatory email to complete the process.

Though I used my private email, not this address, which is kept for
mailing lists only.

-- 
Stuart Winsor

Tools With A Mission
sending tools across the world
http://www.twam.co.uk/



Re: JavaScript enabled builds for Atari

2016-01-26 Thread Ole Loots
Hello,

Enabling JS for the Atari builds was great :-)

Attached is a screenshot of another testpage from the NetSurf JS
Testsuite. 

Things like  work fine.
document.getElementById() works, and document.write() works too, at
least when called during the rendering phase.

The JS Prime-Test page took 160 sec. to load here, which is about what
I expected.

document.getElementById('xyz').innerHTML = "New Content";

- does nothing. 


document.getElementById('textarea').value = "New Content";

- does set the correct value, but the GUI is not refreshed. If that
would work, NetSurf would be able to provide some kind of JS-Calculator,
using the textarea as the display... 
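
Something along these lines is the kind of thing I mean - a minimal,
untested sketch, where all the element IDs and handler names are made
up for illustration:

  <textarea id="display"></textarea>
  <button onclick="press('1')">1</button>
  <button onclick="press('+')">+</button>
  <button onclick="press('2')">2</button>
  <button onclick="calc()">=</button>
  <script>
  // runs after the elements above exist in the document
  var display = document.getElementById('display');
  function press(key) {
      // append the pressed key to the expression in the textarea
      display.value = display.value + key;
  }
  function calc() {
      // evaluate the expression and show the result
      // (a real calculator would validate the input first)
      display.value = String(eval(display.value));
  }
  </script>

Of course this only becomes usable once the display is actually
refreshed after each assignment to .value.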

I can confirm that the ASCII-fractal renders fine; it took 8.6 sec
with m68k-aranym (clocked at 300 MHz).

Very promising - looking forward to some way of redrawing the
page/HTML elements :-)

Greets,
Ole

On Monday, 25.01.2016, at 21:12 +, Peter Slegg wrote:
> >Date: Sun, 24 Jan 2016 09:53:46 +
> >From: Michael Drake 
> >Subject: Re: JavaScript enabled builds for Atari
> >To: netsurf-users@netsurf-browser.org
> >
> >
> >
> >On 23/01/16 23:06, Peter Slegg wrote:
> >
> >> Using Netsurf JS today it has crashed a few times.
> >> It isn't as stable as the last non-JS versions.
> >
> >How about when you use it with "enable_javascript:0"?
> >
> >Michael Drake  http://www.netsurf-browser.org/
> 
> With JS disabled it seems to be as stable as normal, no crashes
> since disabling JS yesterday and using it to visit 20-30 pages.
> 
> Is there a JS test page that can be used to check simple JS features?
> 
> Regards,
> 
> Peter



Re: Filtering webpages

2016-01-26 Thread lists
In article <4ee4134855.ga...@wra1th.plus.com>,
   Gavin Wraith  wrote:
> I seem to remember using a browser on an Acorn machine, many years ago,
> that gave you the option of not displaying images unless you
> specifically clicked on the icon that the browser used to indicate a
> missing image.

Fresco

I still have it on this Kinetic. The display menu had the following
options, which could be ticked or not as required:

No Pictures
Antialias
Document colours  (could turn off background colours to make content more 
   visible)
Frames
Tables
Stop animations   (There was a particular website I recall that had a
  black background with flashing "stars". It was horrid but
  the flashing could be turned off with this option)
Set width
Controls  (Buttons, URL bar, Status)

-- 
Stuart Winsor

Tools With A Mission
sending tools across the world
http://www.twam.co.uk/



Re: Big push on testing needed

2016-01-26 Thread Dave Higton
In message <57e4a64755.iyoj...@rickman.argonet.co.uk>
  John Rickman Iyonix  wrote:

>As far as I know javascript should ignore html comments and the 
>javascript validator does not flag them as errors
>
> http://www.javascriptlint.com/online_lint.php

My reference suggests that an HTML comment is /not/ a legal Javascript
comment.  Perhaps you should open a discussion with the author of the
above application.  I would, of course, be very interested to know the
conclusion!

Dave





Re: Big push on testing needed

2016-01-26 Thread Dave Higton
In message <5547e95991stuartli...@orpheusinternet.co.uk>
  lists  wrote:

>In article <3bc72c4755.davem...@my.inbox.com>,
>   Dave Higton  wrote:
>> Please, everybody, download the latest test build (which will,
>> of course, change as bugs are found and fixed), give it a good
>> thrashing, and get your bug reports in.
>
>A site I use a lot is
>
>http://cpc.farnell.com/
>
>Although it displays quite nicely, the "search" box, which should be within
>the broad blue banner at the top, to the right of "All products", isn't
>visible - is this fixable?
>
>#3312

Please raise an issue on Mantis.

NetSurf has an unsatisfactory layout engine.  It needs replacing.
This is clearly a big job.  However, please don't let that deter
you from raising the issue - it's not clear to me whether particular
issues, like the one you point out, can be fixed independently
of the major rework.

Dave





Re: Big push on testing needed

2016-01-26 Thread lists
In article ,
   Dave Higton  wrote:
> Please raise an issue on Mantis.

At around 10 this morning, I went to the bug-tracker site and went through
the process of creating a login - user name, email, captcha - all went OK
but I'm still waiting for the confirmation email to complete the process!

-- 
Stuart Winsor

Tools With A Mission
sending tools across the world
http://www.twam.co.uk/



Re: Big push on testing needed

2016-01-26 Thread John Rickman Iyonix
Dave Higton  wrote

> In message <57e4a64755.iyoj...@rickman.argonet.co.uk>
>   John Rickman Iyonix  wrote:

>>As far as I know javascript should ignore html comments and the
>>javascript validator does not flag them as errors
>>
>> http://www.javascriptlint.com/online_lint.php

> My reference suggests that an HTML comment is /not/ a legal Javascript
> comment.  Perhaps you should open a discussion with the author of the
> above application.  I would, of course, be very interested to know the
> conclusion!

It doesn't really matter now as I have changed all the comments to 
"proper" JavaScript style. I don't remember where I got the idea that 
HTML style was acceptable in JS, but the article in this link says 
they are OK:-
  http://www.javascripter.net/faq/comments.htm

I will investigate further.

John

-- 
John Rickman -  http://rickman.orpheusweb.co.uk/lynx



Re: Big push on testing needed

2016-01-26 Thread Chris Young
On 26 January 2016 23:58:51 GMT+00:00, John Rickman Iyonix wrote:
>Dave Higton  wrote
>
>> In message <57e4a64755.iyoj...@rickman.argonet.co.uk>
>>   John Rickman Iyonix  wrote:
>
>>>As far as I know javascript should ignore html comments and the
>>>javascript validator does not flag them as errors
>>>
>>> http://www.javascriptlint.com/online_lint.php
>
>> My reference suggests that an HTML comment is /not/ a legal Javascript
>> comment.  Perhaps you should open a discussion with the author of the
>> above application.  I would, of course, be very interested to know the
>> conclusion!
>
>It doesn't really matter now as I have changed all the comments to 
>"proper" JavaScript style. I don't remember where I got the idea that 
>html style was acceptable in JS, but the article in this link says 
>they are ok:-
>  http://www.javascripter.net/faq/comments.htm
>
>I will investigate further.

They must be OK to some extent, because the old advice was to encapsulate 
everything within <!-- and --> markers inside the <script> element, so 
that script-unaware browsers would not render the code as text.
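
For reference, the old idiom looked like this (quoting from memory -
browsers treat a leading <!-- inside script code as a one-line comment,
which is why it never broke the script itself):

  <script>
  <!--
  document.write("hidden from script-unaware browsers");
  // the closing marker is wrapped in a JS comment for safety
  // -->
  </script>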

Re: 2 sites that don't!

2016-01-26 Thread Grahame Parish
In message <5547fb845eli...@torrens.org.uk>
  "Richard Torrens (lists)"  wrote:

> http://www.catbehaviourist.com/sure-feed-microchip-pet-feeder-review/

> http://www.zooplus.co.uk/shop/cats/cat_bowls_feeders/feeders/programma
> ble/479534?gclid=CImfsqu3x8oCFUORGwodHCUFYQ

> Both display very badly. First site all text is hidden. The source is very
> over-complicated, so I can't see why.

Not related to the problem with the sites, but I confirm that the 
feeders work very well.  We have several cats with different dietary  
requirements.

-- 
Grahame Parish



Re: 2 sites that don't!

2016-01-26 Thread Gavin Wraith
In message <5547fb845eli...@torrens.org.uk>
  "Richard Torrens (lists)"  wrote:

>http://www.catbehaviourist.com/sure-feed-microchip-pet-feeder-review/
>
>http://www.zooplus.co.uk/shop/cats/cat_bowls_feeders/feeders/programmable/479534?gclid=CImfsqu3x8oCFUORGwodHCUFYQ
>
>Both display very badly. First site all text is hidden. The source is very
>over-complicated, so I can't see why.

I have a useful little Lua script, called noscript, for use with
StrongED that strips out all the stuff between matching <script> and
</script> tags. Here it is:

#! lua
io.input (arg[1])                     -- the saved HTML source
local text = io.read "*all"           -- slurp the whole file
io.input ( )
local pat = "<script.->.-</script>"   -- each script element (non-greedy;
                                      -- lower-case tags assumed)
io.write ((text:gsub (pat, "")))      -- emit the source with them stripped
-- NB: the pattern and final line are a reconstruction; the archived
-- copy was truncated here.