On Mon, Feb 01, 2016 at 12:18:41AM +0000, lists wrote:
> In article <437fcf4a55.ricp@user.minijem.plus.com>,
> Richard Porter <ricp@minijem.plus.com> wrote:
> > On 31 Jan 2016 lists wrote:
>
> > > There is "My view|View issues|Change log|Roadmap"
>
> > > NOWHERE ON THAT PAGE IS A BOX THAT SAYS REPORT BUG HERE!
>
> > There should be a link "Report Issue" after "View issues".
>
> I'm not seeing that here - NetSurf #3312
You need to be logged in to report bugs. You have two accounts, one
created on 2013-12-09 12:19 ('stuartwinsor') and one created on 2016-01-26
08:49 ('stuart'). I have no idea why you might not have received the
email, perhaps Orpheus classified it as spam or something.
I have asked Mantis to send password reset emails for the more recent
one. I'll leave the older one in case you happen to know the password
for it already.
B.
Sunday, 31 January 2016
Re: Bug reporting
In article <437fcf4a55.ricp@user.minijem.plus.com>,
Richard Porter <ricp@minijem.plus.com> wrote:
> On 31 Jan 2016 lists wrote:
> > There is "My view|View issues|Change log|Roadmap"
> > NOWHERE ON THAT PAGE IS A BOX THAT SAYS REPORT BUG HERE!
> There should be a link "Report Issue" after "View issues".
I know nothing about HTML, but after saving out the source, converting it
to text and extracting a selection, this would appear to be the relevant bit:
--------------------------------------------------------------------
</select> <input type="submit" class="button-small" value="Switch"
/></form><a
href="http://bugs.netsurf-browser.org/mantis/issues_rss.php?project_id=0"><img src="/mantis/images/rss.png" alt="RSS" style="border-style: none;
margin: 5px; vertical-align: middle;" /></a></td></tr></table><table
class="width100" cellspacing="0"><tr><td class="menu"><a
href="/mantis/my_view_page.php">My View</a> | <a
href="/mantis/view_all_bug_page.php">View Issues</a> | <a
href="/mantis/changelog_page.php">Change Log</a> | <a
href="/mantis/roadmap_page.php">Roadmap</a></td><td class="menu right
nowrap"><form method="post" action="/mantis/jump_to_bug.php"><input
type="text" name="bug_id" size="10" class="small" value="Issue #"
onfocus="if (this.value == 'Issue #') this.value = ''" onblur="if
(this.value == '') this.value = 'Issue #'" /> <input type="submit"
class="button-small" value="Jump" /> </form></td></tr></table>
<div align="center">
<table class="hide" border="0" cellspacing="3" cellpadding="0">
---------------------------------------------------------------------
I can pick out
href="/mantis/my_view_page.php">My View</a> | <a
href="/mantis/view_all_bug_page.php">View Issues</a> | <a
href="/mantis/changelog_page.php">Change Log</a> | <a
href="/mantis/roadmap_page.php">Roadmap</a></td><td class="menu right
There is no reference there to "Report issue"
--
Stuart Winsor
Tools With A Mission
sending tools across the world
http://www.twam.co.uk/
Re: Bug reporting
In article <554ad1a000Stuartlists@orpheusinternet.co.uk>,
lists <Stuartlists@orpheusinternet.co.uk> wrote:
> In article <437fcf4a55.ricp@user.minijem.plus.com>,
> Richard Porter <ricp@minijem.plus.com> wrote:
> > On 31 Jan 2016 lists wrote:
> > > There is "My view|View issues|Change log|Roadmap"
> > > NOWHERE ON THAT PAGE IS A BOX THAT SAYS REPORT BUG HERE!
> > There should be a link "Report Issue" after "View issues".
> I'm not seeing that here - NetSurf #3312
Doesn't show up in !Fresco either.
--
Stuart Winsor
Tools With A Mission
sending tools across the world
http://www.twam.co.uk/
Re: Bug reporting
In article <437fcf4a55.ricp@user.minijem.plus.com>,
Richard Porter <ricp@minijem.plus.com> wrote:
> On 31 Jan 2016 lists wrote:
> > There is "My view|View issues|Change log|Roadmap"
> > NOWHERE ON THAT PAGE IS A BOX THAT SAYS REPORT BUG HERE!
> There should be a link "Report Issue" after "View issues".
I'm not seeing that here - NetSurf #3312
--
Stuart Winsor
Tools With A Mission
sending tools across the world
http://www.twam.co.uk/
Re: Bug reporting
On 31 Jan 2016 lists wrote:
> There is "My view|View issues|Change log|Roadmap"
> NOWHERE ON THAT PAGE IS A BOX THAT SAYS REPORT BUG HERE!
There should be a link "Report Issue" after "View issues".
--
Richard Porter http://www.minijem.plus.com/
Skype: minijem2 mailto:ricp@minijem.plus.com
I don't want a "user experience" - I just want stuff that works.
Bug reporting
Would someone please provide a page of step-by-step instructions on how
to actually report a bug.
Following a link on bug reporting, on Jan 26th I went here:
http://bugs.netsurf-browser.org/mantis/my_view_page.php
I selected "Sign up for new account", which took me to a page where I
entered a username and email as requested. It all appeared to go through,
and it said I was to receive a confirmation email, which I still have not
received.
I later revisited this page, entered my details again, and it said my user
name was already taken, which is to be expected.
Having lost patience, tonight I returned to
http://bugs.netsurf-browser.org/mantis/my_view_page.php
and opted to log in anonymously, which took me back to
http://bugs.netsurf-browser.org/mantis/my_view_page.php
There is "|Login|signup for new account".
There is "My view|View issues|Change log|Roadmap"
NOWHERE ON THAT PAGE IS A BOX THAT SAYS REPORT BUG HERE!
Yours frustratedly,
Stuart
--
Stuart Winsor
Tools With A Mission
sending tools across the world
http://www.twam.co.uk/
More bugs fixed
More bugs have been fixed, thus closing Mantis cases 2413, 2415 and
2418. I recommend to everyone that you download a new CI build of
NetSurf.
Happy testing :-)
Dave
Friday, 29 January 2016
Bugs To Focus On
Hi,
Where can I see a list of bugs I could focus on?
Or better, could you point a newcomer to something to work on?
I'm interested in the JS support eventually, but I want to start small.
Re: auto-launching a PDF
On 29 Jan 2016 Jim Nagel wrote:
> When Netsurf fetches a PDF file, would it be possible to trigger the
> PDF-reading application automatically, rather than requiring the user
> to save the file to disc and then launch it manually?
It would also be nice if NetSurf could automatically launch a playlist
(.m3u) file. Oregano does this. I raised a feature request for this a
long time ago.
--
Richard Porter http://www.minijem.plus.com/
Skype: minijem2 mailto:ricp@minijem.plus.com
I don't want a "user experience" - I just want stuff that works.
auto-launching a PDF
When Netsurf fetches a PDF file, would it be possible to trigger the
PDF-reading application automatically, rather than requiring the user
to save the file to disc and then launch it manually?
(The manual procedure is too complex to explain to a technophobe Other
Half.)
What prompts me to ask is that I came across the following text inside
!PDFtest.Docs.Hints, which is dated 1998-03-24, apparently by Leo
Smiers (who ported the PDF reader from XPDF on Linux; Colin Granville
carried on the work):
=====
Hints dd 980309
=====
Starting !PDF from !Fresco version 1.32
It's possible to make Fresco start !PDF when it downloads
a PDF file. To do this, you must add lines to the
!InetSuite.Internet.Files.MimeMap and !Fresco.Runables
files.
In the MimeMap file add the line:
application/pdf PDF ADF .pdf
In !Fresco.Runables (if this file does not exist you have
to create it) add the line:
ADF
Now quit and restart !Fresco. When you view a .pdf file,
Fresco will filer_run the downloaded pdf file. If !PDF has
been seen, it will load and display the page.
Many thanks to Dean Murphy from ANT Ltd. Cambridge from
whom I received this information.
=====
Perhaps this facility exists already in Netsurf and I just don't know
about it.
--
Jim Nagel www.archivemag.co.uk
Re: Filtering webpages
Gavin Wraith <gavin@wra1th.plus.com> wrote:
> In message <mpro.o1oojm0082tl406gk.atdotcodotuk@dotcodotukat.co.uk>
> Vince M Hudd <atdotcodotuk@dotcodotukat.co.uk> wrote:
[blocking domains via the host file]
> > If you can do something similar in your router, you will achieve the
> > same result for any computer on your network.
> Thanks for that tip. I realize that the idea of a webpage being hosted at
> a particular URL, with pictures, styles, javascript, ... etc being loaded
> in from elsewhere is not necessarily realistic.
It's actually very *common* - for scripts in particular, with advertising
and analytics services being the most obvious examples.
You *might* find blocking such things results in a speed gain from NetSurf
on some sites (it depends on a number of factors), and/or you might find the
resulting pages are less cluttered and easier to read.
(TBH, I'm not using RISC OS / NetSurf enough yet to be able to do
anything more than suggest it as an untested possibility.)
> It is often quite hard to see from the source html text exactly what is
> happening.
True enough.
--
Vince M Hudd
Soft Rock Software
Don't forget to vote in the 2015 RISC OS Awards:
www.riscosawards.co.uk/vote2015.html
Re: Filtering webpages
In message <mpro.o1oojm0082tl406gk.atdotcodotuk@dotcodotukat.co.uk>
Vince M Hudd <atdotcodotuk@dotcodotukat.co.uk> wrote:
>At the domain level, you could add entries to your hosts file for sources
>you want blocked, along the lines of:
>
>127.0.0.1 google-analytics.com
>
>This will effectively stop websites you visit on that computer *with any
>browser* from loading anything from that domain.
>
>If you can do something similar in your router, you will achieve the same
>result for any computer on your network.
Thanks for that tip. I realize that the idea of a webpage being hosted
at a particular URL, with pictures, styles, JavaScript, etc. being loaded
in from elsewhere, is not necessarily realistic. It is often quite hard
to see from the source HTML text exactly what is happening.
--
Gavin Wraith (gavin@wra1th.plus.com)
Home page: http://www.wra1th.plus.com/
Thursday, 28 January 2016
Re: Filtering webpages
Gavin Wraith <gavin@wra1th.plus.com> wrote:
> I would also like to be able to discriminate content by source URL and to
> give permissions for which should be blocked or which allowed through.
At the domain level, you could add entries to your hosts file for sources
you want blocked, along the lines of:
127.0.0.1 google-analytics.com
This will effectively stop websites you visit on that computer *with any
browser* from loading anything from that domain.
If you can do something similar in your router, you will achieve the same
result for any computer on your network.
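For anyone wanting to try this, here is a minimal sketch of what such
entries look like, with a quick sanity check of the fragment (the domains
are illustrative; on a real system the lines would go into /etc/hosts or
your platform's equivalent):

```shell
# Write a sample hosts-style fragment (illustrative domains only; on a
# real system these lines belong in /etc/hosts or the platform equivalent).
printf '127.0.0.1 google-analytics.com\n127.0.0.1 www.google-analytics.com\n' > hosts.example

# Hosts files match exact hostnames with no wildcard support, so each
# name you want blocked needs its own line. Count the blocking entries:
grep -c '^127\.0\.0\.1 ' hosts.example
```

With entries like those in the real hosts file, any browser on that
machine resolves the listed names to loopback, so fetches to them fail
immediately.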
--
Vince M Hudd
Soft Rock Software
Don't forget to vote in the 2015 RISC OS Awards:
www.riscosawards.co.uk/vote2015.html
Re: Very slow page rendering
On Thursday, 28 Jan 2016 at 21:04 +0000, Peter Slegg wrote:
>
> https://www.royalmail.com/track-your-item
>
> Another page that took ages to display and looked like the css had
> failed as well.
>
Early versions of the Atari port crashed because of stack size issues:
the mintlib regex module, called by the NetSurf CSS parser, overflowed
the stack while processing exorbitant strings (3 KB inputs that expanded
to several megabytes inside the regex module).
I prevented the crash by compiling mintlib with the
+DEFS=-DREGEX_MALLOC
define.
This is also applied to the CI builds.
Just a guess: there is some kind of slowdown when doing excessive malloc
operations with MiNT.
Greets,
Ole
Re: Very slow page rendering
> Date: Sun, 10 Jan 2016 16:43:40 +0100
> From: Jean-François Lemaire <jflemaire@skynet.be>
> Subject: Re: Very slow page rendering
>
> On Saturday 09 January 2016 19:43:53 Peter Slegg wrote:
> > >Date: Sat, 09 Jan 2016 16:11:40 GMT
>
> > >> Peter Slegg <p.slegg@scubadivers.co.uk> wrote:
> > >>>http://git.netsurf-browser.org/netsurf.git/tree/atari/gemtk/guiwin.c
>
> > >>>This page takes about 20 mins to download and render, Highwire browser
> > >>>takes about 6sec.
>
> > No criticism, I am hoping this might help the devs find some speed
> > improvements.
>
> I have an Atari 2.9 version lying around and with that build it takes 100 secs
> to render that page. Still very slow but much less so than with the 3.*
> builds.
>
> Cheers,
> JFL
https://www.royalmail.com/track-your-item
Another page that took ages to display and looked like the CSS had
failed as well.
Peter
Re: Big push on testing needed
In message <5548c5c513cvjazz@waitrose.com>
Chris Newman <cvjazz@waitrose.com> wrote:
> In article <3bc72c4755.DaveMeUK@my.inbox.com>,
> Dave Higton <dave@davehigton.me.uk> wrote:
>> Big news...
>> Current test (CI) builds are now release candidates. Yes, a new
>> release of NetSurf is imminent.
>> Please, everybody, download the latest test build (which will,
>> of course, change as bugs are found and fixed), give it a good
>> thrashing, and get your bug reports in.
>> Please also note that, since it's now close to release time, the
>> Javascript setting in Choices->Content is obeyed (and has been
>> for a couple of days or so now).
> Greetings from sunny Australia (gloat, gloat),
> The 38 Degrees petition page at....
> https://secure.38degrees.org.uk/Scotland-stop-CETA
> Is a bit of a pigs dinner. It takes ages to load, frames overlap & the
> signing link doesn't work.
> Dev CI #3315 on Virtual Acorn Adjust 4.39
> Same effects with JS on or off.
> Works OK using Maxthon browser on the Windows side.
> Does anyone see the same effects?
2.6s JS off, 5.4s JS on, CI #3312, 5.21 (RC14), Pi 2 @ 900MHz. Page
display is substantially different compared to Otter 0.9.09 on RISC
OS. The signing link doesn't work in Netsurf with JS on or off, but
Otter seems fully functional with JS enabled. I didn't make a precise
count of page loading time in Otter - it's considerably slower than
NS, about 20-30 secs JS off/on.
--
George
Wednesday, 27 January 2016
Re: Big push on testing needed
In article <5548c5c513cvjazz@waitrose.com>,
Chris Newman <cvjazz@waitrose.com> wrote:
> In article <3bc72c4755.DaveMeUK@my.inbox.com>,
> Dave Higton <dave@davehigton.me.uk> wrote:
> > Big news...
> > Current test (CI) builds are now release candidates. Yes, a new
> > release of NetSurf is imminent.
> > Please, everybody, download the latest test build (which will,
> > of course, change as bugs are found and fixed), give it a good
> > thrashing, and get your bug reports in.
> > Please also note that, since it's now close to release time, the
> > Javascript setting in Choices->Content is obeyed (and has been
> > for a couple of days or so now).
> Greetings from sunny Australia (gloat, gloat),
> The 38 Degrees petition page at....
> https://secure.38degrees.org.uk/Scotland-stop-CETA
> Is a bit of a pigs dinner. It takes ages to load, frames overlap & the
> signing link doesn't work.
10.2 secs here, and only one frame overlaps. Not really a pig's dinner,
not even a dog's breakfast, I would think. I don't know about the signing
link; something happens, but I'm not sure whether it does so correctly.
#3315 on VRPC 4.02
> Dev CI #3315 on Virtual Acorn Adjust 4.39
> Same effects with JS on or off.
> Works OK using Maxthon browser on the Windows side.
> Does anyone see the same effects?
Re: Big push on testing needed
In article <3bc72c4755.DaveMeUK@my.inbox.com>,
Dave Higton <dave@davehigton.me.uk> wrote:
> Big news...
> Current test (CI) builds are now release candidates. Yes, a new
> release of NetSurf is imminent.
> Please, everybody, download the latest test build (which will,
> of course, change as bugs are found and fixed), give it a good
> thrashing, and get your bug reports in.
> Please also note that, since it's now close to release time, the
> Javascript setting in Choices->Content is obeyed (and has been
> for a couple of days or so now).
Greetings from sunny Australia (gloat, gloat),
The 38 Degrees petition page at
https://secure.38degrees.org.uk/Scotland-stop-CETA
is a bit of a pig's dinner. It takes ages to load, frames overlap, and the
signing link doesn't work.
Dev CI #3315 on Virtual Acorn Adjust 4.39
Same effects with JS on or off.
Works OK using Maxthon browser on the Windows side.
Does anyone see the same effects?
--
Chris
Re: Big push on testing needed
Michael Drake, on 27 Jan, wrote:
> On 27/01/16 18:05, John Rickman Iyonix wrote:
> > Daniel Silverstone wrote
>
> > > Have you reported this, along with an *attached* test case, to the
> > > BTS? If so, can you let me know the issue number?
>
> > Michael Drake has reported it upstream to Duktape
>
> That doesn't change the need for a report in the NetSurf bug tracker. The
> upstream may decide that the fix doesn't belong in duktape, or they could
> implement it, but that would fix nothing for NetSurf users unless we
> update duktape in the NetSurf source.
>
> We have an issue tracker to keep track of all the things we need to do.
> It's impossible to remember everything.
>
> I've created an issue here:
>
> http://bugs.netsurf-browser.org/mantis/view.php?id=2416
>
> I'd be grateful if someone could attach a test case.
Done.
--
David Pitt
Re: Big push on testing needed
On 27/01/16 18:05, John Rickman Iyonix wrote:
> Daniel Silverstone wrote
>> Have you reported this, along with an *attached* test case, to the BTS?
>> If so, can you let me know the issue number?
> Michael Drake has reported it upstream to Duktape
That doesn't change the need for a report in the NetSurf bug tracker.
The upstream may decide that the fix doesn't belong in duktape, or
they could implement it, but that would fix nothing for NetSurf
users unless we update duktape in the NetSurf source.
We have an issue tracker to keep track of all the things we need to
do. It's impossible to remember everything.
I've created an issue here:
http://bugs.netsurf-browser.org/mantis/view.php?id=2416
I'd be grateful if someone could attach a test case.
Cheers,
--
Michael Drake http://www.netsurf-browser.org/
Re: Big push on testing needed
Daniel Silverstone wrote
> On Mon, Jan 25, 2016 at 20:43:19 +0000, John Rickman Iyonix wrote:
>> My JavaScript is now working. The problem was that the new interpreter
>> does not like HTML comments between the <script> and </script> tags.
> Have you reported this, along with an *attached* test case, to the BTS?
> If so, can you let me know the issue number?
Hello Daniel,
Michael Drake has reported it upstream to Duktape
https://github.com/svaarala/duktape/pull/564
John
--
John Rickman - http://rickman.orpheusweb.co.uk/lynx
No sé nada, y no estoy seguro de eso (I know nothing, and I'm not sure of that)
Re: Big push on testing needed
On 27/01/16 10:33, John Rickman Iyonix wrote:
> Assuming that JavaScript is entirely defined by the ECMAScript
> standard, it seems clear that HTML-style comments are valid.
Thanks John. This has been reported upstream to the maintainer of
Duktape (the JavaScript engine we use).
You can follow the issue here:
https://github.com/svaarala/duktape/pull/564
Cheers,
--
Michael Drake http://www.codethink.co.uk/
Incorrect rendering of SVG files
Hello!
Since SVG support on GTK is marked as complete, I think the result
I'm seeing is a bug.
Consider the following SVG file:
<?xml version="1.0" encoding="UTF-8"?>
<svg xmlns="http://www.w3.org/2000/svg" width="32" height="32">
<rect style="fill:red" y="0" x="0" height="32" width="32" />
</svg>
The image above is rendered as two rectangles side by side, one red
and the other blue, while it appears all red in any other
image viewer, image editor, or web browser.
See the screenshot at: http://ogg.ge/netsurf.png
I'm using NetSurf 3.4 on Debian Jessie GNU/Linux.
I will happily share any other technical details if needed.
Thanks.
Re: Filtering webpages
Stuart Winsor (aka "lists") wrote on 26 Jan:
> Fresco ... display menu had the following
> options, which could be ticked or not as required
> Document colours (could turn off background colours to make content visible)
...
> Controls (Buttons, URL bar, Status)
Yes. I have always missed these two features in Netsurf, especially
Fresco's control button that toggled ECMAscript (Javascript) on or off
(and showed its current state).
--
Jim Nagel www.archivemag.co.uk
Re: Big push on testing needed
On Mon, Jan 25, 2016 at 20:43:19 +0000, John Rickman Iyonix wrote:
> My JavaScript is now working. The problem was that the new interpreter
> does not like HTML comments between the <script> and </script> tags.
Have you reported this, along with an *attached* test case, to the BTS?
If so, can you let me know the issue number?
D.
--
Daniel Silverstone http://www.netsurf-browser.org/
PGP mail accepted and encouraged. Key Id: 3CCE BABE 206C 3B69
Re: Big push on testing needed
>Dave Higton wrote
> My reference suggests that an HTML comment is /not/ a legal Javascript
> comment. Perhaps you should open a discussion with the author of the
> above application. I would, of course, be very interested to know the
> conclusion!
Dave - below is an extract from the
"Ecma Standard definition of the ECMAScript 2015 Language",
Assuming that JavaScript is entirely defined by the ECMAScript
standard, it seems clear that HTML-style comments are valid.
*****************************************************************
"Annex B (normative) Additional ECMAScript Features for Web Browsers"
...
B.1.3 HTML-like Comments
The syntax and semantics of 11.4 is extended as follows except that
this extension is not allowed when parsing source
code using the goal symbol Module:
Syntax

Comment ::
    MultiLineComment
    SingleLineComment
    SingleLineHTMLOpenComment
    SingleLineHTMLCloseComment
    SingleLineDelimitedComment

MultiLineComment ::
    /* FirstCommentLine_opt LineTerminator MultiLineCommentChars_opt */ HTMLCloseComment_opt

FirstCommentLine ::
    SingleLineDelimitedCommentChars

SingleLineHTMLOpenComment ::
    <!-- SingleLineCommentChars_opt

SingleLineHTMLCloseComment ::
    LineTerminatorSequence HTMLCloseComment

SingleLineDelimitedComment ::
    /* SingleLineDelimitedCommentChars_opt */

HTMLCloseComment ::
    WhiteSpaceSequence_opt SingleLineDelimitedCommentSequence_opt --> SingleLineCommentChars_opt

SingleLineDelimitedCommentChars ::
    SingleLineNotAsteriskChar SingleLineDelimitedCommentChars_opt
    * SingleLinePostAsteriskCommentChars_opt

SingleLineNotAsteriskChar ::
    SourceCharacter but not one of * or LineTerminator

SingleLinePostAsteriskCommentChars ::
    SingleLineNotForwardSlashOrAsteriskChar SingleLineDelimitedCommentChars_opt
    * SingleLinePostAsteriskCommentChars_opt

SingleLineNotForwardSlashOrAsteriskChar ::
    SourceCharacter but not one of / or * or LineTerminator

WhiteSpaceSequence ::
    WhiteSpace WhiteSpaceSequence_opt

SingleLineDelimitedCommentSequence ::
    SingleLineDelimitedComment WhiteSpaceSequence_opt SingleLineDelimitedCommentSequence_opt

(The "_opt" suffix marks terms the spec typesets with a subscript "opt", i.e. optional.)
Copyright Ecma International 2015
page 523
*****************************************************************
--
Tuesday, 26 January 2016
Re: Big push on testing needed
On 26 January 2016 23:58:51 GMT+00:00, John Rickman Iyonix <rickman@argonet.co.uk> wrote:
>Dave Higton wrote
>
>> In message <57e4a64755.iyojohn@rickman.argonet.co.uk>
>> John Rickman Iyonix <rickman@argonet.co.uk> wrote:
>
>>>As far as I know javascript should ignore html comments and the
>>>javascript validator does not flag them as errors
>>>
>>> http://www.javascriptlint.com/online_lint.php
>
>> My reference suggests that an HTML comment is /not/ a legal
>Javascript
>> comment. Perhaps you should open a discussion with the author of the
>> above application. I would, of course, be very interested to know
>the
>> conclusion!
>
>It doesn't really matter now as I have changed all the comments to
>"proper" JavaScript style. I don't remember where I got the idea that
>html style was acceptable in JS, but the article in this link says
>they are ok:-
> http://www.javascripter.net/faq/comments.htm
>
>I will investigate further.
They must be OK to some extent, because the old advice was to encapsulate everything within <script> tags within an HTML comment to stop the script being displayed on screen.
I don't know if that practice is still common, but NetSurf should probably support it if it doesn't already. (<style> tags have the same advice btw)
Chris
Re: Big push on testing needed
Dave Higton wrote
> In message <57e4a64755.iyojohn@rickman.argonet.co.uk>
> John Rickman Iyonix <rickman@argonet.co.uk> wrote:
>>As far as I know javascript should ignore html comments and the
>>javascript validator does not flag them as errors
>>
>> http://www.javascriptlint.com/online_lint.php
> My reference suggests that an HTML comment is /not/ a legal Javascript
> comment. Perhaps you should open a discussion with the author of the
> above application. I would, of course, be very interested to know the
> conclusion!
It doesn't really matter now as I have changed all the comments to
"proper" JavaScript style. I don't remember where I got the idea that
html style was acceptable in JS, but the article in this link says
they are ok:-
http://www.javascripter.net/faq/comments.htm
I will investigate further.
John
--
John Rickman - http://rickman.orpheusweb.co.uk/lynx
Re: Big push on testing needed
In article <cc37374855.DaveMeUK@my.inbox.com>,
Dave Higton <dave@davehigton.me.uk> wrote:
> Please raise an issue on Mantis.
At around 10 this morning, I went to the bug-tracker site and went through
the process of creating a login - user name, email, captcha - all went OK
but I'm still waiting for the confirmation email to complete the process!
--
Stuart Winsor
Tools With A Mission
sending tools across the world
http://www.twam.co.uk/
Re: Big push on testing needed
In message <5547e95991Stuartlists@orpheusinternet.co.uk>
lists <Stuartlists@orpheusinternet.co.uk> wrote:
>In article <3bc72c4755.DaveMeUK@my.inbox.com>,
> Dave Higton <dave@davehigton.me.uk> wrote:
>> Please, everybody, download the latest test build (which will,
>> of course, change as bugs are found and fixed), give it a good
>> thrashing, and get your bug reports in.
>
>A site I use a lot is
>
>http://cpc.farnell.com/
>
>Although it displays quite nicely, the "search" box, which should be within
>the broad blue banner at the top, to the right of "All products", isn't
>visible - is this fixable?
>
>#3312
Please raise an issue on Mantis.
NetSurf has an unsatisfactory layout engine. It needs replacing.
This is clearly a big job. However, please don't let that deter
you from raising the issue - it's not clear to me whether particular
issues, like the one you point out, can be fixed independently
of the major rework.
Dave
Re: Big push on testing needed
In message <57e4a64755.iyojohn@rickman.argonet.co.uk>
John Rickman Iyonix <rickman@argonet.co.uk> wrote:
>As far as I know javascript should ignore html comments and the
>javascript validator does not flag them as errors
>
> http://www.javascriptlint.com/online_lint.php
My reference suggests that an HTML comment is /not/ a legal Javascript
comment. Perhaps you should open a discussion with the author of the
above application. I would, of course, be very interested to know the
conclusion!
Dave
Re: Big push on testing needed
In message <ns75cba24755.Bryn@yo.rk>
Bryn Evans <netsurf@bryork.freeuk.com> wrote:
>In a mad moment - Dave (Dec 20) mumbled:
>
>> Big news...
>
>> Current test (CI) builds are now release candidates. Yes, a new
>> release of NetSurf is imminent.
>
>> Please, everybody, download the latest test build (which will,
>> of course, change as bugs are found and fixed), give it a good
>> thrashing, and get your bug reports in.
>
>Thanks for the update.
>
>Using build 3308 everything seems OK with and without JavaScript.
>
>This is both on RiscPC, RO 4.02 and RaPi (B) RO 5.23 (Dec 20)
>
> There is NO Pain!
>
>One anomaly has shown up. On the RiscPC, the AMAZON Log In page is
>fine, but on the RaPi the first box 'mobile or email' is limited
>to showing only the last 5 characters I enter, making it difficult
>to spot typos. Again this is the same for both JS on and off.
Please raise an issue on Mantis.
Dave
Re: Filtering webpages
In article <4ee4134855.gavin@wra1th.plus.com>,
Gavin Wraith <gavin@wra1th.plus.com> wrote:
> I seem to remember using a browser on an Acorn machine, many years ago,
> that gave you the option of not displaying images unless you
> specifically clicked on the icon that the browser used to indicate a
> missing image.
Fresco
I still have it on this Kinetic. The display menu had the following
options, which could be ticked or not as required
No Pictures
Antialias
Document colours (could turn off background colours to make content more
visible)
Frames
Tables
Stop animations (There was a particular website I recall that had a
black background with flashing "stars". It was horrid but
the flashing could be turned off with this option)
Set width
Controls (Buttons, URL bar, Status)
--
Stuart Winsor
Tools With A Mission
sending tools across the world
http://www.twam.co.uk/
Re: JavaScript enabled builds for Atari
Hello,
enabling JS for the atari builds was great :-)
Attached is a screenshot of another testpage from the NetSurf JS
Testsuite.
Things like <button onclick="dostuff();"> work fine.
document.getElementById() works, and document.write() works too, at least
when called during the rendering phase.
The JS Prime-Test page took 160 sec. to load here, which is what I
expected.
document.getElementById('xyz').innerHTML = "New Content";
- does nothing.
document.getElementById('textarea').value = "New Content";
- does set the correct value, but the GUI is not refreshed. If that
worked, NetSurf could provide some kind of JS calculator,
using the textarea as the display...
I can confirm that the ASCII-fractal is rendering fine, took 8.6 sec
with m68k-aranym (Clocking: 300Mhz).
Very promising; looking forward to some way to redraw the
page/HTML elements :-)
Greets,
Ole
Am Montag, den 25.01.2016, 21:12 +0000 schrieb Peter Slegg:
> >Date: Sun, 24 Jan 2016 09:53:46 +0000
> >From: Michael Drake <tlsa@netsurf-browser.org>
> >Subject: Re: JavaScript enabled builds for Atari
> >To: netsurf-users@netsurf-browser.org
> >
> >
> >
> >On 23/01/16 23:06, Peter Slegg wrote:
> >
> >> Using Netsurf JS today it has crashed a few times.
> >> It isn't as stable as the last non-JS versions.
> >
> >How about when you use it with "enable_javascript:0"?
> >
> >Michael Drake http://www.netsurf-browser.org/
>
> With JS disabled it seems to be as stable as normal, no crashes
> since disabling JS yesterday and using JS to visit 20-30 pages.
>
> Is there a JS test page that can be used to check simple JS features ?
>
> Regards,
>
> Peter
>
>
>
>
>
>
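Ole's JS-calculator idea could be sketched as below. This is hypothetical code: a stub object stands in for the real element, which in a page would be something like document.getElementById('display') on a <textarea>, and it only becomes usable in NetSurf once the GUI redraws after .value changes:

```javascript
// Toy calculator logic driving a "display" value, as suggested above.
// The stub lets the logic run outside a browser; in a real page you
// would swap in the actual textarea element.
var display = { value: "" };

function press(key) {
  display.value += key;              // append a digit or operator
}

function clearDisplay() {
  display.value = "";
}

function equals() {
  // eval() is acceptable for a toy sketch; a real calculator
  // should parse the expression instead.
  display.value = String(eval(display.value));
}

press("2"); press("+"); press("3");
equals();
console.log(display.value);          // "5"
```

With a working redraw, wiring press() and equals() to <button onclick="..."> handlers (which Ole reports already work) would be all that's missing.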
Re: NetSurf progress
In article <41035CF02C7.0000007Bdave@davehigton.me.uk>,
Dave Higton <dave@davehigton.me.uk> wrote:
> NetSurf is fully capable of creating bug reports using the Mantis
> issue tracker, including attaching files and adding notes. If
> you have a problem, ask on this list for help.
This morning I went through the process of signing up to the bug-tracker,
I'm still awaiting the confirmatory email to complete the process.
Though I used my private email, not this address, which is kept for mailing
list only.
--
Stuart Winsor
Tools With A Mission
sending tools across the world
http://www.twam.co.uk/
/ Archive 23:12 white-on-grey
cj wrote on 26 Jan:
> ... and what about the latest issue of Archive where, on page 3 you
> have white text on a light grey background???
Yes; that was bad. The contrast looked fine in the past, and looked
fine this time too on screen, but unfortunately my printshop used a
new printer. Before next issue I'll ask him to put through a test
sheet with samples of the whole range of tints.
--
Jim Nagel www.archivemag.co.uk
Re: Filtering webpages
In message <55480e2bfeLists@Torrens.org.uk>
"Richard Torrens (lists)" <Lists@Torrens.org.uk> wrote:
>NS has choices:
>Hide advertisements
>Disable pop-up windows
>Disable JavaScript
>
>These, I would guess, tell it to ignore such content. It is then filtering
>exactly as you seem to want!
These are desirable but not general enough for my purposes.
>So I guess you are asking for a throwback icon, to throw the source out to
>an external editor, where it can be processed and returned back to NS?
Exactly.
I seem to remember using a browser on an Acorn machine, many years ago, that
gave you the option of not displaying images unless you specifically clicked
on the icon that the browser used to indicate a missing image.
I would also like to be able to discriminate content by source URL and to give permissions
for which should be blocked or which allowed through. But I suspect that this requires
filtering at the packet level. My feeling is that the majority of the public
have little idea about what is going on when they use the internet, as opposed
to the businesses which would like to exploit their ignorance. Web savvy programmers
produce the websites and also the browsers. I would like to see users take back, or
be given back (by appropriate tools) more control over this predator/prey scenario.
It is a question of freedom.
--
Gavin Wraith (gavin@wra1th.plus.com)
Home page: http://www.wra1th.plus.com/
"Richard Torrens (lists)" <Lists@Torrens.org.uk> wrote:
>NS has choices:
>Hide advertisements
>Disable pop-up windows
>Disable JavaScript
>
>These, I would guess, tell it to ignore such content. It is then filtering
>exactly as you seem to want!
These are desirable but not general enough for my purposes.
>So I guess you are asking for a throwback icon, to throw the source out to
>an external editor, where it can be processed and returned back to NS?
Exactly.
I seem to remember using a browser on an Acorn machine, many years ago, that
gave you the option of not displaying images unless you specifically clicked
on the icon that the browser used to indicate a missing image.
I would also like to be able to discriminate content by source URL and to give permissions
for which should be blocked or which allowed through. But I suspect that this requires
filtering at the packet level. My feeling is that the majority of the public
have little idea about what is going on when they use the internet, as opposed
to the businesses which would like to exploit their ignorance. Web savvy programmers
produce the websites and also the browsers. I would like to see users take back, or
be given back (by appropriate tools) more control over this predator/prey scenario.
It is a question of freedom.
--
Gavin Wraith (gavin@wra1th.plus.com)
Home page: http://www.wra1th.plus.com/
Re: 2 sites that don't!
In message <09d3064855.jim@abbeypress.net>
Jim Nagel <netsurf@abbeypress.co.uk> wrote:
>Did that on the first web page. Got a security warning from StrongEd
>about "process"; I said "allow always".
>Then "File name '.in' not recognised".
Sounds like !StrongED$ScrapDir has not been set. See the line
Set StrongED$Script_Outfile <StrongED$ScrapDir>.in
in !StrongED.Defaults.Tools.!ScriptSED.Tools.Despatch .
--
Gavin Wraith (gavin@wra1th.plus.com)
Home page: http://www.wra1th.plus.com/
Re: Filtering webpages
In article <c2cd074855.gavin@wra1th.plus.com>,
Gavin Wraith <gavin@wra1th.plus.com> wrote:
> So many web pages these days are crammed with stuff put in by
> advertisers or third parties which upset the viewing experience,
> or which NetSurf is unable to render properly. The individual
> user is increasingly going to need tools to fight back, with
> which they can emasculate the page of all the clutter added in
> for statistical purposes, to add unwanted advertisements and
> pop-ups. The days of innocence are long over, so filtering of
> web pages is now a necessity.
NS has choices:
Hide advertisements
Disable pop-up windows
Disable JavaScript
These, I would guess, tell it to ignore such content. It is then filtering
exactly as you seem to want!
So I guess you are asking for a throwback icon, to throw the source out to
an external editor, where it can be processed and returned back to NS?
This is perfectly possible within RISC OS. But I doubt that it is possible
in other OSes.
I hope the developers will contradict me and say it's totally possible -
but I have doubts!
--
Richard Torrens.
http://www.Torrens.org.uk for genealogy, natural history, wild food, walks, cats
and more!
Re: 2 sites that don't!
In article <09d3064855.jim@abbeypress.net>,
Jim Nagel <netsurf@abbeypress.co.uk> wrote:
> Have often wished for a similar thing to override bothersome CSS,
> particularly CSS that specifies tiny grey text (which seems to be a
> current fashion among some designers with younger eyes than mine).
... and what about the latest issue of Archive where, on page 3 you
have white text on a light grey background???
--
Chris Johnson
Filtering webpages
It is some years ago that Fred Graute added scriptability
to StrongED, using the 'apply' icon. It enables one to
alter what is displayed in a StrongED window by dragging
a script to that icon.
There are often occasions when one would like to massage
webpages displayed in a browser. At present one has
to save out the displayed page's textual source to
StrongED, drag in the script to do the massaging,
save the result as an HTML file and then display that
result in the browser. It would be nice if this process
could be simplified and done entirely within the browser,
without having to-and-fro between browser and StrongED.
A typically useful script is one that removes all the
source starting <script ... up to a matching </script>.
So many web pages these days are crammed with stuff put in by
advertisers or third parties which upset the viewing experience,
or which NetSurf is unable to render properly. The individual
user is increasingly going to need tools to fight back, with
which they can emasculate the page of all the clutter added in
for statistical purposes, to add unwanted advertisements and
pop-ups. The days of innocence are long over, so filtering of
web pages is now a necessity.
I am not sufficiently au fait with the sources of NetSurf to
know if there are hooks available for scriptability beyond
saving out the page being displayed. I would love to hear what
other NetSurf users think.
--
Gavin Wraith (gavin@wra1th.plus.com)
Home page: http://www.wra1th.plus.com/
Re: 2 sites that don't!
Gavin Wraith wrote on 26 Jan:
> Using it on the text of the first webpage produces source for a leaner
> webpage that reveals the hidden text. Here is the procedure:
> 1. Page->View Source in NetSurf
> 2. Save to some scrap directory.
> 3. Shift-click to load into StrongED
> 4. Shift-Drag noscript to apply icon.
> 5. Save the result and change its type to HTML.
Did that on the first web page. Got a security warning from StrongEd
about "process"; I said "allow always".
Then "File name '.in' not recognised".
Maybe my version of Lua is too old? !BOOT.Resources.!Lua.version says
5.70 (2013-08-16).
Potentially sounds like a very welcome utility.
Have often wished for a similar thing to override bothersome CSS,
particularly CSS that specifies tiny grey text (which seems to be a
current fashion among some designers with younger eyes than mine).
--
Jim Nagel www.archivemag.co.uk
Re: 2 sites that don't!
In article <145dff4755.gavin@wra1th.plus.com>,
Gavin Wraith <gavin@wra1th.plus.com> wrote:
> I have a useful little Lua script, called noscript, for use with
> StrongED that strips out all the stuff between matching <script ... >
> and </script> tags. Here it is:
Useful: I have on occasion looked for the tags and deleted everything
between - it can be done with a StrongED search and replace. But the
script is easier.
--
Richard Torrens.
http://www.Torrens.org.uk for genealogy, natural history, wild food, walks, cats
and more!
Re: 2 sites that don't!
In article <a777fe4755.GrahameParish@grahame.parish>,
Grahame Parish <maillist.parish@millers-way.net> wrote:
> Not related to the problem with the sites, but I confirm that the
> feeders work very well. We have several cats with different dietary
> requirements.
Thanks - but any other discussion, off-list I think!
--
Richard Torrens.
http://www.Torrens.org.uk for genealogy, natural history, wild food, walks, cats
and more!
Re: 2 sites that don't!
In message <5547fb845eLists@Torrens.org.uk>
"Richard Torrens (lists)" <Lists@Torrens.org.uk> wrote:
>http://www.catbehaviourist.com/sure-feed-microchip-pet-feeder-review/
>
>http://www.zooplus.co.uk/shop/cats/cat_bowls_feeders/feeders/programmable/479534?gclid=CImfsqu3x8oCFUORGwodHCUFYQ
>
>Both display very badly. First site all text is hidden. The source is very
>over-complicated, so I can't see why.
I have a useful little Lua script, called noscript, for use with StrongED that strips out all the
stuff between matching <script ... > and </script> tags. Here it is:
#! lua
io.input (arg[1])                    -- read from the file named on the command line
local text = io.read "*all"          -- slurp the whole source in one go
io.input ( )                         -- restore the default input
local pat = "<script[^<]*</script>" -- a <script> element and everything inside it
io.write ((text:gsub (pat, "")))     -- double parentheses discard gsub's second return value
Using it on the text of the first webpage produces source for a leaner webpage
that reveals the hidden text. Here is the procedure:
1. Page->View Source in NetSurf
2. Save to some scrap directory.
3. Shift-click to load into StrongED
4. Shift-Drag noscript to apply icon.
5. Save the result and change its type to HTML.
It would be nice if NetSurf could have an apply icon like StrongED, which
could make a single drag action suffice.
--
Gavin Wraith (gavin@wra1th.plus.com)
Home page: http://www.wra1th.plus.com/
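An aside on Gavin's pattern: "<script[^<]*</script>" stops matching as soon as the script body itself contains a "<" (as in "a < b", or markup inside a string), so such blocks are left in place. A non-greedy match avoids that. A rough sketch of the same idea in JavaScript for comparison (the name stripScripts is just illustrative):

```javascript
// Strip every <script ...>...</script> element from an HTML string.
// [\s\S]*? is non-greedy, so the match still ends at the first
// </script> even when the script body contains "<".
function stripScripts(html) {
  return html.replace(/<script\b[^>]*>[\s\S]*?<\/script>/gi, "");
}

const page = '<p>text</p><script>if (a < b) { hide(); }</script><p>more</p>';
console.log(stripScripts(page)); // → <p>text</p><p>more</p>
```

In Lua the equivalent fix would be the non-greedy pattern "<script.-</script>" in place of "<script[^<]*</script>".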
Re: 2 sites that don't!
In message <5547fb845eLists@Torrens.org.uk>
"Richard Torrens (lists)" <Lists@Torrens.org.uk> wrote:
> http://www.catbehaviourist.com/sure-feed-microchip-pet-feeder-review/
> http://www.zooplus.co.uk/shop/cats/cat_bowls_feeders/feeders/programma
> ble/479534?gclid=CImfsqu3x8oCFUORGwodHCUFYQ
> Both display very badly. First site all text is hidden. The source is very
> over-complicated, so I can't see why.
Not related to the problem with the sites, but I confirm that the
feeders work very well. We have several cats with different dietary
requirements.
--
Grahame Parish
2 sites that don't!
http://www.catbehaviourist.com/sure-feed-microchip-pet-feeder-review/
http://www.zooplus.co.uk/shop/cats/cat_bowls_feeders/feeders/programmable/479534?gclid=CImfsqu3x8oCFUORGwodHCUFYQ
Both display very badly. First site all text is hidden. The source is very
over-complicated, so I can't see why.
--
Richard Torrens.
http://www.Torrens.org.uk for genealogy, natural history, wild food, walks, cats
and more!
Re: Big push on testing needed
In article <3bc72c4755.DaveMeUK@my.inbox.com>,
Dave Higton <dave@davehigton.me.uk> wrote:
> Please, everybody, download the latest test build (which will,
> of course, change as bugs are found and fixed), give it a good
> thrashing, and get your bug reports in.
A site I use a lot is
http://cpc.farnell.com/
Although it displays quite nicely, the "search" box, which should be within
the broad blue banner at the top, to the right of "All products", isn't
visible - is this fixable?
#3312
--
Stuart Winsor
Tools With A Mission
sending tools across the world
http://www.twam.co.uk/
Re: Latest builds missing.
Michael Drake, on 26 Jan, wrote:
>
>
> On 26/01/16 07:49, David Pitt wrote:
> > Currently the Changes page shows #3312 as the latest successful build
> > but the Downloads page only goes up to #3307 for all builds.
>
> The downloads are still there, but "json" has been removed from the
> filenames. http://ci.netsurf-browser.org/builds/riscos/?C=M;O=D
So they are, just at the top of the page now.
Thanks.
--
David Pitt
Re: Latest builds missing.
On 26 Jan 2016 David Pitt <pittdj@pittdj.co.uk> wrote:
> Currently the Changes page shows #3312 as the latest successful build but
> the Downloads page only goes up to #3307 for all builds.
> http://ci.netsurf-browser.org/jenkins/job/netsurf/changes
> http://ci.netsurf-browser.org/builds/riscos/
I've just downloaded #3312 using Frank's version of Fetch_NS.
Best wishes,
Peter.
--
Peter Young (zfc Os) and family
Prestbury, Cheltenham, Glos. GL52, England
http://pnyoung.orpheusweb.co.uk
pnyoung@ormail.co.uk
Re: Latest builds missing.
On 26/01/16 07:49, David Pitt wrote:
> Currently the Changes page shows #3312 as the latest successful build but
> the Downloads page only goes up to #3307 for all builds.
The downloads are still there, but "json" has been removed from the
filenames. http://ci.netsurf-browser.org/builds/riscos/?C=M;O=D
--
Michael Drake http://www.netsurf-browser.org/
Monday, 25 January 2016
Latest builds missing.
Currently the Changes page shows #3312 as the latest successful build but
the Downloads page only goes up to #3307 for all builds.
http://ci.netsurf-browser.org/jenkins/job/netsurf/changes
http://ci.netsurf-browser.org/builds/riscos/
--
David Pitt
Re: Big push on testing needed
On 25/01/16 21:43, Dave Higton wrote:
> One other thing: the CI builds are coming out with logging
> enabled. This will be changed for the release version.
All RISC OS releases and CI builds have had logging enabled.
I would not suggest turning it off. If something does
go wrong there would be nothing to help us fix the problem.
In the very rare cases when we've logged so much that it
makes a significant impact on performance it's because we're
aware something is buggy and want to gather the info needed
to help fix it.
The reason I suggested turning logging off to you last night
was to determine what the problem was without needing to
rebuild anything.
Since then, I've disabled the excessive JavaScript debug
output, so there should be no reason for users to do
anything unusual with the !Run file.
Cheers,
--
Michael Drake http://www.netsurf-browser.org/
Re: Big push on testing needed
In message <554757c729bbailey@argonet.co.uk>
Brian <bbailey@argonet.co.uk> wrote:
>In article <3bc72c4755.DaveMeUK@my.inbox.com>,
> Dave Higton <dave@davehigton.me.uk> wrote:
>> Big news...
>
>> Current test (CI) builds are now release candidates. Yes, a new
>> release of NetSurf is imminent.
>
>> Please, everybody, download the latest test build (which will,
>> of course, change as bugs are found and fixed), give it a good
>> thrashing, and get your bug reports in.
>
>> Please also note that, since it's now close to release time, the
>> Javascript setting in Choices->Content is obeyed (and has been
>> for a couple of days or so now).
>
>> Dave
>
>Fetching websites seems to storm along at a rate of knots. Has something
>been done to NetSurf to enable this?
It has been suggested that the Unixlib fixes may be responsible.
If you go back far enough (before early November 2015), the
regular expression parser was sometimes responsible for an
unbelievably high proportion of page load times. (At the
developer weekend last autumn, I saw an example where it took
90% of the CPU time.)
One other thing: the CI builds are coming out with logging
enabled. This will be changed for the release version. So,
if you want NS to go even faster, turn logging off (you'll
see the setting near the top of the !Run file). Of course:
1) you'll have to turn it back on again to get a log to report
a bug;
2) you'll want to turn it off again for each new CI version
you install.
It's a bit of an extreme example, but a Javascript test I
ran last night took over 6 minutes with logging enabled (it
created a 25 MB file) but 20 seconds with logging disabled.
Dave
Re: Big push on testing needed
In message <57e4a64755.iyojohn@rickman.argonet.co.uk>
John Rickman Iyonix <rickman@argonet.co.uk> wrote:
>John Rickman Iyonix wrote
>
>> http://rickman.orpheusweb.co.uk/testing/test5.html
>
>> I have now traced the problem to an error in my code. I need to fix
>> the problem and try again.
>
>My javascript is now working. the problem was that the new interpreter
>does not like html comments between the <script> and </script> tags.
>
> <!DOCTYPE HTML>
> <html><head> <title>Bakehouse-Cyber</title></head><body>
> <script>
> <!-- here is some javascript -->
> document.write("hello world");
> </script>
> </body> </html>
>
>As far as I know javascript should ignore html comments and the
>javascript validator does not flag them as errors
>
> http://www.javascriptlint.com/online_lint.php
My O'Reilly Javascript book tells me that Javascript supports C-
style and C++-style comments. The only mention of HTML-style
comments relates to really, really, old browsers.
I think your code above is genuinely wrong, although I guess
it might be supported by some browsers that are deliberately
written to be tolerant of errors.
I'd recommend that you change your comment line to:
// here is some javascript
Dave
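Dave's fix, made runnable: swap the HTML-style comment for a "//" line comment and any ES engine accepts the script. A minimal sketch, with write() standing in for document.write since there is no DOM outside a browser:

```javascript
// John's snippet with the HTML-style comment replaced by a plain
// JavaScript line comment, which every interpreter understands.
// write() collects output instead of calling document.write.
const parts = [];
const write = (s) => parts.push(s);

// here is some javascript
write("hello world");

console.log(parts.join("")); // → hello world
```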
#3305 JS on
First time for several years: HMRC on-line now requires JS.
The site complains that JS is off (I had JS on) but it works
nevertheless. I have no idea whether it worked with earlier builds, but
very satisfying to have it work with latest build.
Thanks a lot.
Gerald
Re: JavaScript enabled builds for Atari
>Date: Sun, 24 Jan 2016 09:53:46 +0000
>From: Michael Drake <tlsa@netsurf-browser.org>
>Subject: Re: JavaScript enabled builds for Atari
>To: netsurf-users@netsurf-browser.org
>
>
>
>On 23/01/16 23:06, Peter Slegg wrote:
>
>> Using Netsurf JS today it has crashed a few times.
>> It isn't as stable as the last non-JS versions.
>
>How about when you use it with "enable_javascript:0"?
>
>Michael Drake http://www.netsurf-browser.org/
With JS disabled it seems to be as stable as normal, no crashes
since disabling JS yesterday and using JS to visit 20-30 pages.
Is there a JS test page that can be used to check simple JS features ?
Regards,
Peter
Re: Big push on testing needed
John Rickman Iyonix wrote
> http://rickman.orpheusweb.co.uk/testing/test5.html
> I have now traced the problem to an error in my code. I need to fix
> the problem and try again.
My javascript is now working. The problem was that the new interpreter
does not like html comments between the <script> and </script> tags.
<!DOCTYPE HTML>
<html><head> <title>Bakehouse-Cyber</title></head><body>
<script>
<!-- here is some javascript -->
document.write("hello world");
</script>
</body> </html>
As far as I know javascript should ignore html comments and the
javascript validator does not flag them as errors
http://www.javascriptlint.com/online_lint.php
John
--
John Rickman - http://rickman.orpheusweb.co.uk/lynx
Re: Big push on testing needed
Dave Higton wrote
>> I am happy to download every day, or whenever a new version is
>> available.
>> Is there any information available about the current state of
>> javascript?
> That's impossible to answer in any way that is both simple and
> meaningful.
> The Javascript interpreter was replaced with a different one
> a few months ago. The decision was taken not to make a new
> stable release of NS until all the Javascript features of the
> previous interpreter are present again. However, some other
> JS features are also present.
> The JS implementation is still far from complete, though. If
> you need a RISC OS browser with complete (-ish) Javascript at
> the moment, you'll have to use Otter or QupZilla, and put up
> with the slower speeds.
> To look at your question from a different angle: is there a
> particular Javascript feature that you need?
Hello Dave, thanks for the information. It is sufficient for my
purpose. When the JS interpreter was changed all of my JS example
code stopped working. I believed, mistakenly it turns out, that the JS
support in NetSurf was a sort of placeholder. I have been waiting for
some sort of announcement that it had been re-implemented.
I have now looked more closely at my code and believe I have found the
problem.
This link which contains various javascript examples used to work in
NetSurf and now does not. The fact that it works on other browsers
led me to believe that NetSurf was at fault:-
http://rickman.orpheusweb.co.uk/testing/test5.html
I have now traced the problem to an error in my code. I need to fix
the problem and try again.
John
--
John Rickman - http://rickman.orpheusweb.co.uk/lynx
it transcends plausibility, it's a fact. david mercer
Re: Big push on testing needed
In a mad moment - Dave (Dec 20) mumbled:
> Big news...
> Current test (CI) builds are now release candidates. Yes, a new
> release of NetSurf is imminent.
> Please, everybody, download the latest test build (which will,
> of course, change as bugs are found and fixed), give it a good
> thrashing, and get your bug reports in.
Thanks for the update.
Using build 3308, everything seems OK with and without JavaScript.
This is both on RiscPC, RO 4.02 and RaPi (B) RO 5.23 (Dec 20)
There is NO Pain!
One anomaly has shown up. On the RiscPC, the AMAZON Log In page is
fine, but on the RaPi the first box 'mobile or email' is limited
to showing only the last 5 characters I enter, making it difficult
to spot typos. Again this is the same for both JS on and off.
--
|) [
|)ryn [vans mail to - BrynEvans@bryork.freeuk.com
Google News display silly
If you go to news.google.co.uk/news you get a list of news items, most
of which have a thumbnail illustration. If JavaScript is off, all the
thumbnails are shown, but if it is on, only the first four appear.
--
David Wild using RISC OS on broadband
www.davidhwild.me.uk
Re: Big push on testing needed
On Sun, 24 Jan 2016 23:20:33 GMT John Rickman wrote:
> Dave Higton wrote
>
>> Big news...
>
>> Current test (CI) builds are now release candidates. Yes, a new
>> release of NetSurf is imminent.
>
>> Please, everybody, download the latest test build (which will,
>> of course, change as bugs are found and fixed), give it a good
>> thrashing, and get your bug reports in.
>
>> Please also note that, since it's now close to release time, the
>> Javascript setting in Choices->Content is obeyed (and has been
>> for a couple of days or so now).
>
>
> I am happy to download every day, or whenever a new version is
> available.
> Is there any information available about the current state of
> javascript?
That's impossible to answer in any way that is both simple and
meaningful.
The Javascript interpreter was replaced with a different one
a few months ago. The decision was taken not to make a new
stable release of NS until all the Javascript features of the
previous interpreter are present again. However, some other
JS features are also present.
The JS implementation is still far from complete, though. If
you need a RISC OS browser with complete (-ish) Javascript at
the moment, you'll have to use Otter or QupZilla, and put up
with the slower speeds.
To look at your question from a different angle: is there a
particular Javascript feature that you need?
Dave
____________________________________________________________
FREE 3D EARTH SCREENSAVER - Watch the Earth right on your desktop!
Check it out at http://www.inbox.com/earth
Sunday, 24 January 2016
Re: Big push on testing needed
In article <3bc72c4755.DaveMeUK@my.inbox.com>,
Dave Higton <dave@davehigton.me.uk> wrote:
> Big news...
> Current test (CI) builds are now release candidates. Yes, a new
> release of NetSurf is imminent.
> Please, everybody, download the latest test build (which will,
> of course, change as bugs are found and fixed), give it a good
> thrashing, and get your bug reports in.
> Please also note that, since it's now close to release time, the
> Javascript setting in Choices->Content is obeyed (and has been
> for a couple of days or so now).
> Dave
Fetching websites seems to storm along at a rate of knots. Has something
been done to NetSurf to enable this?
Re: Big push on testing needed
Dave Higton wrote
> Big news...
> Current test (CI) builds are now release candidates. Yes, a new
> release of NetSurf is imminent.
> Please, everybody, download the latest test build (which will,
> of course, change as bugs are found and fixed), give it a good
> thrashing, and get your bug reports in.
> Please also note that, since it's now close to release time, the
> Javascript setting in Choices->Content is obeyed (and has been
> for a couple of days or so now).
I am happy to download every day, or whenever a new version is
available.
Is there any information available about the current state of
javascript?
John
--
John Rickman - http://rickman.orpheusweb.co.uk/lynx
Infinity is Sad. God is infinite. God is very sad.
Big push on testing needed
Big news...
Current test (CI) builds are now release candidates. Yes, a new
release of NetSurf is imminent.
Please, everybody, download the latest test build (which will,
of course, change as bugs are found and fixed), give it a good
thrashing, and get your bug reports in.
Please also note that, since it's now close to release time, the
Javascript setting in Choices->Content is obeyed (and has been
for a couple of days or so now).
Dave
____________________________________________________________
FREE ONLINE PHOTOSHARING - Share your photos online with your friends and family!
Visit http://www.inbox.com/photosharing to find out more!
Re: JavaScript enabled builds for Atari
On 23/01/16 23:06, Peter Slegg wrote:
> Using Netsurf JS today it has crashed a few times.
> It isn't as stable as the last non-JS versions.
How about when you use it with "enable_javascript:0"?
--
Michael Drake http://www.netsurf-browser.org/
Saturday, 23 January 2016
Re: JavaScript enabled builds for Atari
On Sat, 23 Jan 2016 13:00:14 , Peter Slegg <p.slegg@scubadivers.co.uk> wrote:
>
> I spotted this in the git commits last night. I tried the build but
> I don't think the setting is exposed in the Atari Choices dialogue.
>
> I just tried build 3296 but I wasn't sure whether 0 or 1 was on.
> After a couple of failed attempts, I think I needed an extra LF in
> the choices file, I seem to have JS working.
>
> The link above gives a character based Mandelbrot:
>
> ..,,,,:::::::::::::::;;;;;;;;;;=¦¦¦¦i++E EHHHO
> ,,,:::::::::::::::;;;;;;;;;;==¦¦¦¦iHhhE ###
>
> Brilliant :-)
>
> I am off to try some more tests.
>
> Many thanks all.
>
> Regards,
>
> Peter
>
>
Using Netsurf JS today it has crashed a few times.
It isn't as stable as the last non-JS versions.
Peter
Re: JavaScript enabled builds for Atari
On Saturday 23 January 2016 13:00:14 Peter Slegg wrote:
> > Date: Sat, 23 Jan 2016 11:22:09 +0000
> > From: Michael Drake <tlsa@netsurf-browser.org>
> > Subject: JavaScript enabled builds for Atari
> > We have enabled JavaScript builds (json) for Atari on the CI server.
> > I'd be interested to get confirmation on whether its working from
> > Atari users. You can test by visiting:
> >
> > http://git.netsurf-browser.org/netsurf.git/plain/test/js/js-fractal.html
> I just tried build 3296 but I wasn't sure whether 0 or 1 was on.
> After a couple of failed attempts, I think I needed an extra LF in
> the choices file, I seem to have JS working.
>
> The link above gives a character based Mandelbrot:
I confirm that the test page is working with the Coldfire build too. Very nice
development!
Cheers,
JFL
--
Jean-François Lemaire
Re: JavaScript enabled builds for Atari
> Date: Sat, 23 Jan 2016 11:22:09 +0000
> From: Michael Drake <tlsa@netsurf-browser.org>
> Subject: JavaScript enabled builds for Atari
>
>
> We have enabled JavaScript builds (json) for Atari on the CI server.
>
> You can get them here:
>
> http://ci.netsurf-browser.org/builds/atari/?C=M;O=D
>
> I'm not sure if the Atari front end exposes the JavaScript option in
> its GUI. If not, you can edit the Choices file setting either:
>
> enable_javascript:0
> or
> enable_javascript:1
>
> I'd be interested to get confirmation on whether its working from
> Atari users. You can test by visiting:
>
> http://git.netsurf-browser.org/netsurf.git/plain/test/js/js-fractal.html
>
I spotted this in the git commits last night. I tried the build but
I don't think the setting is exposed in the Atari Choices dialogue.
I just tried build 3296, but I wasn't sure whether 0 or 1 was on.
After a couple of failed attempts (I think I needed an extra LF in
the choices file), I seem to have JS working.
The link above gives a character based Mandelbrot:
.,,,,:::::::::::::::;;;;;;;;;;=¦¦¦¦i++E EHHHO
,,,:::::::::::::::;;;;;;;;;;==¦¦¦¦iHhhE ###
Brilliant :-)
I am off to try some more tests.
Many thanks all.
Regards,
Peter
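The character-based Mandelbrot that the js-fractal page produces can be
sketched in a few lines. This is an illustrative standalone version,
not the actual source of NetSurf's test page; the character ramp and
dimensions here are made up:

```javascript
// Illustrative ASCII Mandelbrot, similar in spirit to NetSurf's
// js-fractal test page (not the page's actual source).
const chars = " .,:;=|i+hHE#$"; // escape-time ramp, last char = "in the set"

function mandelbrotRow(py, width, height, maxIter) {
  let row = "";
  for (let px = 0; px < width; px++) {
    // Map the character cell to a point c in the complex plane.
    const x0 = (px / width) * 3.5 - 2.5;
    const y0 = (py / height) * 2.0 - 1.0;
    let x = 0, y = 0, iter = 0;
    // Iterate z = z^2 + c until it escapes or we give up.
    while (x * x + y * y <= 4 && iter < maxIter) {
      const xt = x * x - y * y + x0;
      y = 2 * x * y + y0;
      x = xt;
      iter++;
    }
    row += chars[Math.min(iter, chars.length - 1)];
  }
  return row;
}

for (let py = 0; py < 24; py++) {
  console.log(mandelbrotRow(py, 64, 24, 40));
}
```

Since the whole thing is loops, arithmetic and string building, it makes
a handy smoke test for a young JS implementation.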
Re: linux surface
On 22/01/16 14:56, Claus Muus wrote:
> is it correct, that there is no input devices available for the linux
> surface of the netsurf browser?
Yes, I believe there is no input support for the linux framebuffer
nsfb surface at the moment.
> And if yes, is there a way to control the browser if using the linux
> surface?
I don't think so.
> As an alternative I try the sdl surface, but it's so much bigger and
> I'm looking for a as small as possible browser. OK, if the sdl should
> be the only way, since I need to control the netsurf. Is it possible
> to compile sdl without all the (for me) unneeded stuff like sound
> support?
I'm not sure.
--
Michael Drake http://www.netsurf-browser.org/
JavaScript enabled builds for Atari
We have enabled JavaScript builds (json) for Atari on the CI server.
You can get them here:
http://ci.netsurf-browser.org/builds/atari/?C=M;O=D
I'm not sure if the Atari front end exposes the JavaScript option in
its GUI. If not, you can edit the Choices file setting either:
enable_javascript:0
or
enable_javascript:1
I'd be interested to get confirmation from Atari users on whether it's
working. You can test by visiting:
http://git.netsurf-browser.org/netsurf.git/plain/test/js/js-fractal.html
--
Michael Drake http://www.netsurf-browser.org/
Friday, 22 January 2016
linux surface
Hi,
is it correct that there are no input devices available for the linux
surface of the netsurf browser? And if so, is there a way to control
the browser when using the linux surface?
I'm asking because I have netsurf running with the linux surface, but
can't control it by keyboard or mouse.
As an alternative I tried the sdl surface, but it's so much bigger, and
I'm looking for as small a browser as possible. OK if sdl is the only
way, since I need to control netsurf. Is it possible to compile sdl
without all the (for me) unneeded stuff like sound support?
Best thanks
Monday, 18 January 2016
Re: Builds as reflected on changes page under Build History
On Mon, Jan 18, 2016 at 11:39:23AM +0000, John Williams wrote:
>
> I have noticed a few times, including as I write this, that sometimes the
> progress of a build seems to get stuck as shown at:
>
> http://ci.netsurf-browser.org/jenkins/view/All/job/netsurf/changes
>
> Today the progress is still stuck from Jan 15, 2016 7:37 on #3269 at, say,
> 95% completion, and a build #3270 remains "pending".
>
> RISC OS #3269 has downloaded fine, but presumably something has stuck at a
> later point for another OS.
>
> Why/how does this happen?
The CI system is built around the concept of jobs; each job
represents a task the system needs to perform, such as building NetSurf
binaries or packages.
Each job requires some resources usually provided by one or more
worker systems. So for example the master branch NetSurf build
requires a worker (we are transitioning to using the controller/worker
naming from master/slave after some very spirited complaints) for each
compiled architecture (e.g. cislave4 compiles for arm-unknown-riscos).
What you are seeing in your example is where a resource is no longer
available to the CI system (specifically cislave3, running the Haiku
OS). The failure means the in-progress job cannot complete, and new
jobs that require that resource cannot commence.
It requires an administrator (me) to come along, reboot the failed
system, and restart the CI worker software to make the resource
available again and allow the job to complete.
>
> Also, sometimes there is nothing listed under Changes for versions - not
> even a build reference.
>
Builds can be manually started by developers if there has been an
issue during a build; these will have no code changes listed and appear
anonymous.
> Just curious!
>
> John
>
> --
> | John Williams
> | johnrw@ukgateway.net
>
> I think, therefore I am unsure - I think! *
>
>
--
Regards Vincent
http://www.kyllikki.org/
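The resource-blocking behaviour Vincent describes can be sketched as a
toy scheduler: a job needs a specific worker, and while that worker is
offline the job simply stays pending. This is an illustrative model
only, not how Jenkins is actually implemented, and the worker names are
just borrowed from the discussion above:

```javascript
// Toy model of CI job scheduling: jobs bind to a named worker
// resource; jobs whose worker is offline stay queued until an
// admin brings the worker back. (Illustrative sketch only.)
class Scheduler {
  constructor() {
    this.workers = new Map(); // worker name -> online?
    this.pending = [];        // jobs waiting for their worker
  }
  setWorker(name, online) {
    this.workers.set(name, online);
    this.drain();
  }
  submit(job) {
    this.pending.push(job);
    this.drain();
  }
  drain() {
    // Run every job whose worker is online; keep the rest queued.
    this.pending = this.pending.filter((job) => {
      if (this.workers.get(job.needs)) {
        job.state = "done";
        return false;
      }
      job.state = "pending"; // worker offline: job cannot commence
      return true;
    });
  }
}

const ci = new Scheduler();
ci.setWorker("cislave4", true);  // RISC OS builder is up
ci.setWorker("cislave3", false); // Haiku builder has crashed
const riscos = { needs: "cislave4" };
const haiku = { needs: "cislave3" };
ci.submit(riscos); // runs immediately: riscos.state === "done"
ci.submit(haiku);  // stuck: haiku.state === "pending"
ci.setWorker("cislave3", true); // admin reboots the worker
// haiku.state === "done"
```

The key point mirrored from the explanation: nothing about the pending
job is broken; it only needs its one unavailable resource restored.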
Builds as reflected on changes page under Build History
I have noticed a few times, including as I write this, that sometimes the
progress of a build seems to get stuck as shown at:
http://ci.netsurf-browser.org/jenkins/view/All/job/netsurf/changes
Today the progress is still stuck from Jan 15, 2016 7:37 on #3269 at, say,
95% completion, and a build #3270 remains "pending".
RISC OS #3269 has downloaded fine, but presumably something has stuck at a
later point for another OS.
Why/how does this happen?
Also, sometimes there is nothing listed under Changes for versions - not
even a build reference.
Just curious!
John
--
| John Williams
| johnrw@ukgateway.net
I think, therefore I am unsure - I think! *
Saturday, 16 January 2016
Re: Very slow page rendering
> Date: 10 Jan 2016 14:24:53 +0000
> From: "Chris Young" <chris.young@unsatisfactorysoftware.co.uk>
> Subject: Re: Very slow page rendering
> To: <netsurf-users@netsurf-browser.org>
>
>
> I notice the Atari code has three font rendering engines; VDI,
> FreeType and Internal. It look like FreeType and Internal are
> enabled, and FreeType is the default.
>
> Try setting the following in your Choices file:
> atari_font_driver:internal
>
> See if that makes any perceivable difference.
>
> Chris
>
Worth a try but it still took a little over 18 minutes.
I reckon about the same time as it was before.
Regards,
Peter
Atari CI build 3260
Sunday, 10 January 2016
Re: Very slow page rendering
On Saturday 09 January 2016 19:43:53 Peter Slegg wrote:
> >> Peter Slegg <p.slegg@scubadivers.co.uk> wrote:
> >>>http://git.netsurf-browser.org/netsurf.git/tree/atari/gemtk/guiwin.c
> >>>
> >>>This page takes abut 20mins to download and render, Highwire browser
> >>>takes about 6sec.
> No criticism, I am hoping this might help the devs find some speed
> improvements.
I can confirm with an older build that this page takes an insanely long time
to load with the Atari build. Since this seems to be specific to the Atari
port I suppose only an Atari developer could figure out what is going on.
Cheers,
JFL
--
Jean-François Lemaire
Re: Very slow page rendering
On Saturday 09 January 2016 19:43:53 Peter Slegg wrote:
> >Date: Sat, 09 Jan 2016 16:11:40 GMT
> >> Peter Slegg <p.slegg@scubadivers.co.uk> wrote:
> >>>http://git.netsurf-browser.org/netsurf.git/tree/atari/gemtk/guiwin.c
> >>>This page takes abut 20mins to download and render, Highwire browser
> >>>takes about 6sec.
> No criticism, I am hoping this might help the devs find some speed
> improvements.
I have an Atari 2.9 version lying around and with that build it takes 100 secs
to render that page. Still very slow but much less so than with the 3.*
builds.
Cheers,
JFL
--
Jean-François Lemaire
Re: Very slow page rendering
On Sat, 9 Jan 2016 19:43:53 +0000 (GMT), Peter Slegg wrote:
> >>>http://git.netsurf-browser.org/netsurf.git/tree/atari/gemtk/guiwin.c
> >>>
> >>>This page takes abut 20mins to download and render, Highwire browser
> >>>takes about 6sec.
> >
> >> I just tried it with CI#3254 on an Iyonix. Took about 24 sec.
> >
> >> If you're not on a very recent CI build, I would recommend you get
> >> one. Very slow rendering was fixed a couple of months ago.
> >
> >About 43s here on RPC with standard ADSL connection. CI#3250. Firefox
> >on the macbook pro takes 2s over the same connection.
>
> I am using a build from today and I know it is never going to be
> lighting quick on an M68060 but my point is that the speed seems
> unduly slow over an adsl link that normally downloads at 350k/s.
>
> I have reported before that Netsurf on the Atari seems slow at
> downloading pages so it might just be an issue with how Netsurf is
> using the OS ? It's not my area of expertise.
>
> Is it because there are a lot of files ?
> Is it a cache issue ?
>
> Long delays can often be observed on other pages. The git url is
> a useful test case because it is fairly stable and repeatable.
>
> No criticism, I am hoping this might help the devs find some speed
> improvements.
There is a lot of CSS on those git pages, I believe some CSS caching
is being looked into to reduce the amount of processing required.
In my dabblings with building a 68k version of NetSurf for AmigaOS 3,
I've noticed that the text layout and rendering code plays a big part
in how quickly NetSurf loads pages. On AmigaOS 3, rendering outline
font glyphs is painfully slow, and I think this is down to the 68k.
I notice the Atari code has three font rendering engines: VDI,
FreeType and Internal. It looks like FreeType and Internal are
enabled, and FreeType is the default.
Try setting the following in your Choices file:
atari_font_driver:internal
See if that makes any perceivable difference.
Chris
Re: Very slow page rendering
> This page takes about 20 mins to download and render; Highwire browser
> takes about 6sec.
> FWIW on this virtual RPC with JavaScript and css running the
> download/render process takes c10.5 seconds.
> c6.5 secs Win7 VRPC with JS
> I just tried it with CI#3254 on an Iyonix. Took about 24 sec.
> About 4 seconds here, ARMX6 and NS 3254
> About 43s here on RPC with standard ADSL connection. CI#3250. Firefox
> on the macbook pro takes 2s over the same connection.
> c.7 secs here (#3250, JS enabled, Pi2 running 5.21 RC14).
> Takes 34.9 seconds on this SA RPC using Netsurf 3.3.
Interesting figures.
On my ARMiniX with Dev Cl #3244 and Javascript enabled, it takes a bit less
than 4 seconds, every time.
Kind regards,
Paul Sprangers
Saturday, 9 January 2016
Re: Very slow page rendering
>Date: Sat, 09 Jan 2016 16:11:40 GMT
>From: Richard Porter <ricp@minijem.plus.com>
>Subject: Re: Very slow page rendering
>To: netsurf-users@netsurf-browser.org
>On 9 Jan 2016 Dave Higton wrote:
>
>> In message <000a3379.01eff490bef9@smtp.freeola.net>
>> Peter Slegg <p.slegg@scubadivers.co.uk> wrote:
>
>>
>>>http://git.netsurf-browser.org/netsurf.git/tree/atari/gemtk/guiwin.c
>>>
>>>This page takes about 20 mins to download and render; Highwire browser
>>>takes about 6sec.
>
>> I just tried it with CI#3254 on an Iyonix. Took about 24 sec.
>
>> If you're not on a very recent CI build, I would recommend you get
>> one. Very slow rendering was fixed a couple of months ago.
>
>About 43s here on RPC with standard ADSL connection. CI#3250. Firefox
>on the macbook pro takes 2s over the same connection.
I am using a build from today and I know it is never going to be
lightning quick on an M68060, but my point is that the speed seems
unduly slow over an ADSL link that normally downloads at 350k/s.
I have reported before that NetSurf on the Atari seems slow at
downloading pages, so it might just be an issue with how NetSurf is
using the OS? It's not my area of expertise.
Is it because there are a lot of files?
Is it a cache issue?
Long delays can often be observed on other pages. The git url is
a useful test case because it is fairly stable and repeatable.
No criticism, I am hoping this might help the devs find some speed
improvements.
Regards,
Peter
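Since the git URL above is a stable, repeatable test case, one quick way to split the measurement is to time the raw download alone with curl, separating network transfer time from NetSurf's layout and rendering work. This is a sketch, not from the thread: it assumes curl is available, and the `fetch_time` helper name is made up here.

```shell
# Time only the network transfer for a URL, using curl's --write-out
# variable %{time_total} (total transfer time in seconds).
fetch_time() {
    curl -s --max-time 10 -o /dev/null -w '%{time_total}\n' "$1"
}

# The test page from the thread:
fetch_time 'http://git.netsurf-browser.org/netsurf.git/tree/atari/gemtk/guiwin.c'
```

If the download alone completes in a few seconds, the remaining time in NetSurf is layout/render cost rather than the ADSL link.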
Re: Very slow page rendering
In article <553f4721debrian.jordan9@btinternet.com>,
Brian Jordan <brian.jordan9@btinternet.com> wrote:
> In article <000a3379.01eff490bef9@smtp.freeola.net>,
> Peter Slegg <p.slegg@scubadivers.co.uk> wrote:
> > http://git.netsurf-browser.org/netsurf.git/tree/atari/gemtk/guiwin.c
> > This page takes about 20 mins to download and render; Highwire browser
> > takes about 6sec.
> > I think the difference is that Highwire doesn't handle the css
> > so maybe there could be some performance gains to be had in
> > either downloading or the css handling ?
> FWIW on this virtual RPC with JavaScript and css running the
> download/render process takes c10.5 seconds.
Takes 34.9 seconds on this SA RPC using Netsurf 3.3. Tried it on Windows
box on Firefox and it loaded in less than a couple of seconds. Display on
both Firefox and Netsurf seemed essentially the same - just a heading and
then a page of 1430 numbered lines of code. Is that what it is supposed to
look like? Doesn't seem to be much rendering going on!
Alan
--
Alan Calder, Milton Keynes, UK.
Re: Very slow page rendering
In message <3dd24d3f55.DaveMeUK@my.inbox.com>
Dave Higton <dave@davehigton.me.uk> wrote:
> In message <000a3379.01eff490bef9@smtp.freeola.net>
> Peter Slegg <p.slegg@scubadivers.co.uk> wrote:
>>
>>http://git.netsurf-browser.org/netsurf.git/tree/atari/gemtk/guiwin.c
>>
>>This page takes about 20 mins to download and render; Highwire browser
>>takes about 6sec.
> I just tried it with CI#3254 on an Iyonix. Took about 24 sec.
> If you're not on a very recent CI build, I would recommend you get
> one. Very slow rendering was fixed a couple of months ago.
> Dave
c.7 secs here (#3250, JS enabled, Pi2 running 5.21 RC14).
--
George
Re: Very slow page rendering
On 9 Jan 2016 Dave Higton wrote:
> In message <000a3379.01eff490bef9@smtp.freeola.net>
> Peter Slegg <p.slegg@scubadivers.co.uk> wrote:
>>
>>http://git.netsurf-browser.org/netsurf.git/tree/atari/gemtk/guiwin.c
>>
>>This page takes about 20 mins to download and render; Highwire browser
>>takes about 6sec.
> I just tried it with CI#3254 on an Iyonix. Took about 24 sec.
> If you're not on a very recent CI build, I would recommend you get
> one. Very slow rendering was fixed a couple of months ago.
About 43s here on RPC with standard ADSL connection. CI#3250. Firefox
on the macbook pro takes 2s over the same connection.
--
Richard Porter http://www.minijem.plus.com/
Skype: minijem2 mailto:ricp@minijem.plus.com
I don't want a "user experience" - I just want stuff that works.