
How do I save a link to a csv with a script?

Discussion in 'General Scripting Chat' started by niksam, May 21, 2013.

  1. niksam

    niksam Newbie

    Joined:
    May 5, 2013
    Messages:
    15
    Likes Received:
    2
    So I basically want a list of some websites, but I don't want to copy and paste them manually. I want a script that copies the link and saves it into the CSV on the first available line in column 1. I'd prefer to do it with iMacros, but if that doesn't work or you don't know how to do it, any other program is fine as well.

    Thanks for your time! :)
     
  2. CodingAndStuff

    CodingAndStuff Regular Member

    Joined:
    May 6, 2012
    Messages:
    236
    Likes Received:
    84
    Occupation:
    Swagstronaut
    Location:
    You can't have my bots. Sorry :'(
    Sounds like a small project. I'd suggest using PHP and the Simple HTML DOM Parser, which can be found here: http://simplehtmldom.sourceforge.net/.

    You'll want to use the included "file_get_html()" function if you don't need to send custom headers or anything (it's just a wrapper around file_get_contents()). Then iterate through the elements on the page, scrape what you want, dump the data into an array, and finally call "implode()" to join it all up and save the result as a .csv. Since you only need column 1, a newline makes a better glue string than "," so each link lands on its own row.

    Alternatively, you could do it in JavaScript with a simple document.getElementsByTagName('a'), iterate through the tags, dump the hrefs into a buffer, and output it to the console when you're done. If you had provided a link to the data you want to scrape I'd have written a snippet tailored to it, but you didn't, so...yeah.
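
    That said, here's a rough, untested sketch of the PHP route anyway, just to get you started. The URL and output filename below are placeholders (since you didn't give a target), and it assumes simple_html_dom.php sits in the same folder as the script:

    Code:
    <?php
    // Rough sketch only - swap in your own target URL and output file.
    include 'simple_html_dom.php'; // the Simple HTML DOM Parser library

    $url    = 'http://example.com/'; // placeholder page to pull links from
    $output = 'links.csv';           // placeholder output file, one link per row

    // file_get_html() is just the library's wrapper around file_get_contents()
    $html = file_get_html($url);
    if (!$html) {
        die("Could not load $url\n");
    }

    // grab every <a> tag on the page and collect its href
    $links = array();
    foreach ($html->find('a') as $anchor) {
        if (!empty($anchor->href)) {
            $links[] = $anchor->href;
        }
    }

    // a one-column CSV is just one value per line, so glue the links
    // together with newlines and append them to the file
    file_put_contents($output, implode("\n", $links) . "\n", FILE_APPEND);

    echo count($links) . " links appended to $output\n";

    Run it from the command line with "php grab_links.php" (or whatever you name the file). Thanks to FILE_APPEND, re-running it tacks any new links onto the end of the CSV instead of overwriting it.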

    Good luck!