Content Copy of a Webpage

Bhullarz (Skilled contributor):
Using PHP, how can we collect the text data of a webpage? I don't have access to the server of the webpage I want to copy.
shabbir (Go4Expert Founder):
Something like this:

PHP Code:
function getTitle($url)
{
    $doc = new DOMDocument();                 // Create a new DOMDocument object in $doc
    $doc->loadHTML(file_get_contents($url));  // Load the contents of the desired page into $doc
    $a = $doc->getElementsByTagName("title"); // Get every <title> element as a DOMNodeList in $a
    return $a->item(0)->nodeValue;            // Return the text of the first <title> element
}

It will get the title of the page.
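A minimal way to call it, assuming allow_url_fopen is enabled (example.com here is just a placeholder URL):

PHP Code:
libxml_use_internal_errors(true); // Real-world pages are rarely valid HTML; keep parser warnings quiet
echo getTitle("http://www.example.com"); // Prints the text of the page's <title> tag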
Bhullarz (Skilled contributor):
Code:
$url = "http://www.go4expert.com";
$doc = new DOMDocument; // Create a new DOMDocument object in $doc
$doc->loadHTML(file_get_contents($url)); // Load the contents of the desired page into $doc
$a = $doc->getElementsByTagName("title"); // Get every <title> element as a DOMNodeList in $a
echo $a->item(0)->nodeValue; // Print the text of the first <title> element
It didn't work for me. What is the error here?
RandiR (Light Poster):
Use the repro command or the script SS_WebPageToText in biterscripting (free download at biterscripting.com).

1. To copy the web page as it is to a local file, use the following command.

repro "URL" > file.txt

The URL must begin with http://.

2. To extract only the plain text from the web page and store it in a local file, use the following command.

script SS_WebPageToText.txt page("URL") > file.txt

This script is available at biterscripting.com / WW_WebPageToText.html .

Randi
shabbir (Go4Expert Founder):
RandiR, it looks like the page you are referring to does not exist.
RandiR (Light Poster):
Shabbir:

I had to add spaces in the URLs; I guess this site does not allow links otherwise. Just remove the spaces to access the pages. In general, all these pages are at biterscripting . com .

Randi
RandiR (Light Poster):
Shabbir:

Oops, my mistake. The correct URL for the script SS_WebPageToText is (with spaces inserted)
biterscripting.com / SS_WebPageToText.html

Randi

(I had typed WW instead of SS.)
shabbir (Go4Expert Founder):
Links will be allowed once you raise your post count, and yes, it looks correct now.
P455w0rd_Cr4kz (Ambitious contributor):
Why don't you use a website copier, so you can browse offline? Google for webripper, httrack, or webcow.
There are plenty out there, and you can also specify which file types you want to download (txt, php, html, etc.).
Bhullarz (Skilled contributor):
Quote:
Originally Posted by P455w0rd_Cr4kz
Why don't you use a website copier, so you can browse offline? Google for webripper, httrack, or webcow. There are plenty out there, and you can also specify which file types you want to download (txt, php, html, etc.).
Bro! We are trying to extract the text content of the page, not download the page itself, so those tools are of no use in this case. It's like copying the text data of one file and pasting it into another file.
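Building on the DOMDocument approach from earlier in the thread, here is a minimal sketch of pulling just the plain text out of a page. It assumes allow_url_fopen is enabled; example.com and file.txt are placeholders:

PHP Code:
$url = "http://www.example.com"; // Placeholder URL

libxml_use_internal_errors(true); // Real-world HTML is rarely valid; silence parser warnings

$doc = new DOMDocument();
$doc->loadHTML(file_get_contents($url)); // Load the remote page into the DOM

// Remove <script> and <style> nodes so their code does not end up in the extracted text
foreach (array("script", "style") as $tag) {
    $nodes = $doc->getElementsByTagName($tag);
    while ($nodes->length > 0) {
        $node = $nodes->item(0);
        $node->parentNode->removeChild($node); // The node list is live, so it shrinks as we remove
    }
}

// textContent concatenates the text of every remaining node in the document
file_put_contents("file.txt", $doc->textContent);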