[ILUG] Reading material for travel
kae at verens.com
Wed Feb 15 14:50:54 GMT 2012
you could use wget to recursively download the website?
or download a copy of Wikipedia.
if they're news sites, you're obviously not going to be able to stay
up to date while you're offline.
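A minimal sketch of the wget approach: mirror each page one level deep and rewrite links so it reads cleanly offline. The URL and output directory are placeholders; the flags (-r recurse, -l 1 depth one, -k convert links for local viewing, -p fetch page requisites like images/CSS, -E add .html extensions, -np don't ascend to the parent directory) are standard wget options.

```shell
#!/bin/sh
# Placeholder URL -- substitute the sites you actually read.
URL="https://www.example.com/"

# Build the mirroring command; -P names the local output directory.
CMD="wget -r -l 1 -k -p -E -np -P offline-reading $URL"

# Show what would run (drop the echo to actually fetch).
echo "$CMD"
```

Run it from cron the night before travelling and point a browser at the offline-reading directory.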
On 15 February 2012 14:43, <ollie at eillo.org> wrote:
> Hi All,
> I have about 10 or 12 webpages that I check/read on a daily basis.
> I'll be going somewhere where I have limited or no internet connectivity.
> Is there any relatively easy way to collect this stuff from popular sites and
> dump it in a file, in a format that allows easy reading at a later stage?
> It’s a sort of a weird problem - I want to take part of the internet with
> me for reading material.
> Irish Linux Users' Group mailing list
> About this list : http://mail.linux.ie/mailman/listinfo/ilug
> Who we are : http://www.linux.ie/
> Where we are : http://www.linux.ie/map/