Friday, 29 January 2016

Re: Filtering webpages

Gavin Wraith <gavin@wra1th.plus.com> wrote:
> In message <mpro.o1oojm0082tl406gk.atdotcodotuk@dotcodotukat.co.uk>
> Vince M Hudd <atdotcodotuk@dotcodotukat.co.uk> wrote:

[blocking domains via the host file]

> > If you can do something similar in your router, you will achieve the
> > same result for any computer on your network.

> Thanks for that tip. I realize that the idea of a webpage being hosted at
> a particular URL, with pictures, styles, javascript, ... etc being loaded
> in from elsewhere is not necessarily realistic.

It's actually very *common* - for scripts in particular, with advertising
and analytics services being the most obvious examples.

You *might* find blocking such things results in a speed gain from NetSurf
on some sites (it depends on a number of factors), and/or you might find the
resulting pages are less cluttered and easier to read.

(TBH, I'm not using RISC OS / NetSurf enough yet to be able to do anything
more than suggest it as an untested possibility.)
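For anyone who does want to try the hosts file route, the usual trick (a
general illustration, not NetSurf-specific - the domain names below are
made up) is to point the unwanted hosts at an address that goes nowhere:

  0.0.0.0    ads.example.com
  0.0.0.0    analytics.example.com

Anything a page tries to pull in from those hosts then fails straight away
instead of being fetched. Doing the equivalent in the router's blocking or
DNS settings, where it offers them, covers every machine on the network.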

> It is often quite hard to see from the source html text exactly what is
> happening.

True enough.

--
Vince M Hudd
Soft Rock Software

Don't forget to vote in the 2015 RISC OS Awards:
www.riscosawards.co.uk/vote2015.html
