dramaticcat@sh.itjust.works to Lemmy Shitpost@lemmy.world · 1 year ago
Chad scraper (image post, 59 comments)
bill_1992@lemmy.world: Everyone loves the idea of scraping; no one likes maintaining scrapers that break once a week because the CSS or HTML changed.
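The complaint above is the classic failure mode: extraction logic welded to one version of the markup. A common mitigation is a chain of fallback extractors tried newest-first, so a redesign degrades gracefully instead of breaking the scraper outright. A minimal stdlib-only sketch; the "price" markup variants and function names are made up for illustration, not taken from any real site:

```python
# Illustrative sketch: a fallback chain of extractors so a scraper
# survives a markup change. The markup variants below are invented
# for the example, not from any real site.
import re
from typing import Optional


def price_from_data_attr(html: str) -> Optional[float]:
    """Newer (hypothetical) markup: <div data-price="9.99">."""
    m = re.search(r'data-price="([0-9.]+)"', html)
    return float(m.group(1)) if m else None


def price_from_span(html: str) -> Optional[float]:
    """Older (hypothetical) markup: <span class="price">$9.99</span>."""
    m = re.search(r'<span class="price">\$([0-9.]+)</span>', html)
    return float(m.group(1)) if m else None


# Newest strategy first; stale ones stay on as fallbacks instead of
# being deleted on every redesign.
STRATEGIES = [price_from_data_attr, price_from_span]


def extract_price(html: str) -> Optional[float]:
    """Try each known layout in order; None means every layout failed."""
    for extract in STRATEGIES:
        value = extract(html)
        if value is not None:
            return value
    return None  # time to write strategy number three
```

The win is that a layout change costs one new function at the front of the list rather than an emergency rewrite, and a `None` result is an explicit signal that the page changed again.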
camr_on@lemmy.world: I loved scraping until my IP was blocked for botting lol. I know there are ways around it, it's just work though
Pennomi@lemmy.world: I successfully scraped millions of Amazon product listings simply by routing through Tor and cycling the exit node every 10 seconds.
camr_on@lemmy.world: That's a good idea right there, I like that
This guy scrapes
ferret@sh.itjust.works: lmao, yeah, get all the exit nodes banned from Amazon.
Pennomi@lemmy.world: That's the neat thing, it wouldn't, because traffic only spikes for 10 s on any particular node. It blends perfectly into the background noise.
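The 10-second cycling described above boils down to a small scheduling decision between requests. A hedged sketch, with names and the injectable clock being my own illustration: the class only decides *when* to rotate; actually requesting a new circuit would happen out of band (e.g. by sending Tor's NEWNYM signal on its control port, commonly via the third-party `stem` library), which is omitted here.

```python
import time


class ExitNodeRotator:
    """Tracks when it is time to ask Tor for a fresh circuit (new exit node).

    The clock is injectable so the timing logic can be tested without
    actually waiting, and without a running Tor daemon.
    """

    def __init__(self, period: float = 10.0, clock=time.monotonic):
        self.period = period   # seconds to stay on one exit node
        self.clock = clock
        self._last = clock()

    def should_rotate(self) -> bool:
        """Return True (and reset the timer) once `period` seconds elapse.

        On True, the caller would signal NEWNYM on Tor's control port
        before sending the next request through the local SOCKS proxy
        (127.0.0.1:9050 by default).
        """
        now = self.clock()
        if now - self._last >= self.period:
            self._last = now
            return True
        return False
```

In use, each fetch would go through Tor's SOCKS proxy with `should_rotate()` checked between requests, so no single exit node carries more than about ten seconds of traffic.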
Touching_Grass@lemmy.world: You guys use IPs?
camr_on@lemmy.world: I'm coding baby's first bot over here lol, I could probably do better
dangblingus@lemmy.world: Or in the case of Wikipedia, every table on successive pages of sequential data is formatted differently.
Matriks404@lemmy.world: Just use AI to make changes ¯_(ツ)_/¯
Here, take these: \\
¯\_(ツ)_/¯ Thanks