Sorry, but I can’t think of another word for it right now. This is mostly just venting, but if anyone has a better way to do it, I wouldn’t hate to hear it.

I’m trying to set up a home server for all of our family photos. We’re on our way to de-Googling, and part of the impetus for the change is that our Google Drive is almost full. We have a few hundred gigs of photos between us. The problem with trying to download your data from Google is that the only reasonable way it lets you do so is through Google Takeout. First you have to order it. Then you have to wait anywhere from a few hours to a day or two for Google to “prepare” the download. Then you have one week before the takeout “expires.” That’s one week, to the minute, from the time of the initial request.

I don’t have some kind of fancy California internet, just normal home internet, and there is just no way to download a 50-gig (or even a 2-gig) file in one go; there are always interruptions that force me to restart the download. But if you try to download the files too many times, Google throws another error and you have to start over and request a new takeout. Google doesn’t let you download the entire archive at once either; you have to select each file part individually.

I can’t tell you how many weeks I’ve spent trying to download all of the files before they expire or Google throws yet another error.

  • BodilessGaze@sh.itjust.works · 5 months ago

    There’s no financial incentive for them to make it easy to leave Google. Takeout only exists to comply with regulations (e.g. the Digital Markets Act), and as usual, they’re doing the bare minimum to not get sued.

  • butitsnotme@lemmy.world · 5 months ago

    I know it’s not ideal, but if you can afford it, you could rent a VPS from a cloud provider for a week or two, do the Google Takeout download there, and then use rsync or something similar to copy the files to your own server (see the sketch below).
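
    Something like this for the copy-home step, as a rough sketch: the hostnames and paths are placeholders, and it assumes SSH access to the VPS with rsync installed on both ends.

    ```python
    import subprocess

    # Placeholders: adjust to your own VPS and home server layout.
    VPS = "you@your-vps.example.com"
    REMOTE_DIR = "~/takeout/"           # where the Takeout archives landed on the VPS
    LOCAL_DIR = "/mnt/photos/takeout/"  # destination on the home server

    # --partial keeps half-transferred files, so an interrupted copy
    # resumes roughly where it left off instead of starting over.
    subprocess.run(
        ["rsync", "-av", "--partial", "--progress",
         f"{VPS}:{REMOTE_DIR}", LOCAL_DIR],
        check=True,
    )
    ```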

  • Railcar8095@lemm.ee · 5 months ago

    Not sure if somebody mentioned it already, but you can export to OneDrive. So you can get a 1TB account on a free trial or for a single month, export everything there as plain files with no large zips, download it to your computer with the OneDrive app, and then cancel OneDrive.

    Pretend to be in California or the EU, then request full removal of all your data from both Microsoft and Google.

    • gedaliyah@lemmy.world (OP) · 5 months ago

      This route may be the answer. So far I haven’t had any success setting up a download manager that offers real improvements over the browser. I wanted to avoid my photos sitting on two corporate services, but as you say, in theory everything is deletable.

  • weker01@sh.itjust.works · 5 months ago

    Google Takeout is the best GDPR-compliance platform of all the big tech giants. Amazon, for example, makes you wait until the very last day they legally can.

    They also do minimal processing, for example with the metadata (as others commented), since that is probably how they store it internally and it’s what they need to deliver. The simple fact that you can select what you want to request, instead of having to download everything about you, makes it good in my eyes.

    I actually see good-faith compliance with the GDPR in the platform.

    • gedaliyah@lemmy.world (OP) · 5 months ago

      It could absolutely be worse. The main problem is the lack of flexibility. If, for example, I could ask for an extension after downloading 80% of the files over a week, that would be helpful. I’m also beginning to suspect that they cap the download speed, because I’m seeing similar speeds on my home and work networks…
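
      One way I could test the cap theory: time a fixed window of the same download on each network and compare the averages. A rough sketch (the URL would be the authenticated Takeout link from the browser, and I haven’t verified this against Takeout itself):

      ```python
      import time
      import requests

      def measure_throughput(url: str, window: float = 30.0) -> float:
          """Download for a fixed window and return the average speed in Mbit/s."""
          start = time.monotonic()
          received = 0
          with requests.get(url, stream=True, timeout=60) as r:
              r.raise_for_status()
              for chunk in r.iter_content(chunk_size=1 << 16):
                  received += len(chunk)
                  if time.monotonic() - start >= window:
                      break
          elapsed = time.monotonic() - start
          return received * 8 / elapsed / 1e6  # bytes -> bits -> Mbit/s
      ```

      Similar numbers on two very different connections would point at a server-side limit.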

  • Symphonic@lemmy.world · 5 months ago

    I have fancy California internet, and the downloads were surprisingly slow and kept stalling and cutting out. It was such a pain to get my data out of Takeout.

  • Eager Eagle@lemmy.world · 5 months ago

    A 50GB download takes less than 12 hours on a 10Mbps connection, and I had a 10Mbps link 10 years ago in a third-world country, so maybe check your options with your ISP. 50GB really should not be a problem nowadays.
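
    The arithmetic, for reference:

    $$
    \frac{50\,\mathrm{GB} \times 8\,\mathrm{bit/byte}}{10\,\mathrm{Mbit/s}}
    = \frac{400\,000\,\mathrm{Mbit}}{10\,\mathrm{Mbit/s}}
    = 40\,000\,\mathrm{s} \approx 11.1\,\mathrm{h}
    $$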

    • gedaliyah@lemmy.world (OP) · 5 months ago

      It’s not the speed, it’s the interruptions. If I could guarantee an uninterrupted download for 12 hours at a time, I could do it over the course of 3-4 days. I’m looking into some of the download management tools people here have suggested.
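
      From what I understand, a download manager is mostly just doing HTTP range requests with retries, so something like the sketch below is the idea. It assumes the server honors Range requests, and the URL would be the authenticated Takeout link copied from the browser, which is the fragile part:

      ```python
      import os
      import time
      import requests

      def download_with_resume(url: str, dest: str) -> None:
          """Retry a download until it finishes, resuming from bytes already on disk."""
          while True:
              offset = os.path.getsize(dest) if os.path.exists(dest) else 0
              headers = {"Range": f"bytes={offset}-"} if offset else {}
              try:
                  with requests.get(url, headers=headers, stream=True, timeout=60) as r:
                      if offset and r.status_code != 206:
                          # Server ignored the Range header and would restart from
                          # byte zero; stop rather than corrupt the partial file.
                          raise RuntimeError("server did not honor the resume request")
                      r.raise_for_status()
                      with open(dest, "ab" if offset else "wb") as f:
                          for part in r.iter_content(chunk_size=1 << 20):
                              f.write(part)
                  return  # stream ended cleanly
              except requests.RequestException:
                  # Connection dropped mid-stream; wait, then resume from the new
                  # offset. (A real tool would treat fatal HTTP errors, like an
                  # expired link, differently from transient network drops.)
                  time.sleep(5)
      ```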

      • Eager Eagle@lemmy.world · 5 months ago

        That might work. I don’t know if you live in a remote area, but I’d also consider a coffee shop, library, university, or hotel lobby with wifi. You might be able to download it within an hour.

  • Resol van Lemmy@lemmy.world · 5 months ago

    It’s bad because they don’t want you to use it; it only exists so that they don’t get sued by the European Union.

  • irotsoma@lemmy.world · 5 months ago

    Use Drive, or if it’s more than 15GB (or whatever the free max is these days), pay a couple of dollars for one month of storage on one of the supported platforms and download from there.

    • gedaliyah@lemmy.world (OP) · 5 months ago

      Looked promising until:

      When Images are downloaded this strips EXIF location (according to the docs and my tests). This is a limitation of the Google Photos API and is covered by bug #112096115.

      The current google API does not allow photos to be downloaded at original resolution. This is very important if you are, for example, relying on “Google Photos” as a backup of your photos. You will not be able to use rclone to redownload original images. You could use ‘google takeout’ to recover the original photos as a last resort

  • Swarfega@lemm.ee · 5 months ago

    Not really helping you here, but when I started using Google Photos, I still manually downloaded files from my phone to local storage. I did this mainly to ensure I had the original copies of my photos and not some compressed versions. Turns out that was a wise move, as exporting photos from Google is a pretty damned awful experience.

  • rambos@lemm.ee · 5 months ago

    I’m surprised that feature exists, tbh. It worked fine for my 20GB, split into 2GB archives, if I remember correctly.

    • gedaliyah@lemmy.world (OP) · 5 months ago

      I used it for my music collection not that long ago and had no issues. The family’s photo library is an order of magnitude larger, so it’s putting me up against some limitations I didn’t run into before.