I would love the child of a Surface Book and a Framework laptop; or a bare keyboard attached to a screen, into which I could plug my phone (possibly running Phosh) and use it as the hardware for a laptop experience.

  • TheImpressiveX@lemmy.ml · 1 year ago

    You know how Ctrl+F helps you find specific words in browsers? I want that in real life.

    Maybe some special glasses with this ability built-in?

    • utopiah@lemmy.ml · 1 year ago

      Pretty trivial, technically speaking: once you get people’s consent, you record everything, transcribe it with e.g. whisper.cpp or whatever else you have, search within the transcriptions, and generate a link back to the original files, with seek timestamps to double-check if need be.
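The search-and-link-back step could be sketched in a few lines. This is a minimal illustration with a made-up transcript format (timestamped segments per recording; whisper.cpp’s JSON output is similar in spirit, but the exact schema may differ), not whisper.cpp’s actual API:

```python
# Hypothetical transcript store: {audio file -> list of timestamped segments}.
# In practice these would be loaded from whisper.cpp's output files.
transcripts = {
    "standup_2024-01-05.wav": [
        {"start": 0.0, "end": 4.2, "text": "Let's talk about the quarterly roadmap."},
        {"start": 4.2, "end": 9.8, "text": "The e-ink reader project is on hold."},
    ],
    "call_2024-01-07.wav": [
        {"start": 0.0, "end": 3.1, "text": "Did anyone review the roadmap doc?"},
    ],
}

def grep_transcripts(query, transcripts):
    """Return (file, start_seconds, text) for every segment matching query."""
    q = query.lower()
    hits = []
    for path, segments in transcripts.items():
        for seg in segments:
            if q in seg["text"].lower():
                hits.append((path, seg["start"], seg["text"]))
    return hits

for path, start, text in grep_transcripts("roadmap", transcripts):
    # The "link back" could be a player URL with a seek parameter.
    print(f"{path}#t={start:.0f}s: {text}")
```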

    • Couplqnd@lemmy.world · 1 year ago

      Take a look at rewind.ai

      Cool software for the Mac, and they plan to release a personal device that records everything and does what you ask, plus more.

  • Sparking@lemm.ee · 1 year ago (edited)

    Just an open source e-ink device with the build quality of a Kindle. Nothing fancy.

    • utopiah@lemmy.ml · 1 year ago

      reMarkable. I’ve been using gen 1 and then gen 2 for years now; it runs Linux and has an active dev community.

      Less slick, and a much smaller community, but the PineNote also works with Linux, kind of.

      • Sparking@lemm.ee · 1 year ago

        reMarkable looks cool, but I was talking about a dedicated e-reader. They probably won’t bother, because their differentiator is the writing.

        There needs to be one that is Kindle-adjacent, runs Linux, and comes with a ton of selections from Project Gutenberg, sold a little above cost. That’s the only way I could see this working.

      • daddyjones@lemmy.world · 1 year ago

        reMarkable would be awesome if I could read my Kindle books on it. It seems to me that most e-ink tablets are good at either taking notes or reading ebooks, but none are really good at both…

  • Etterra@lemmy.world · 1 year ago

    I want an orbital laser weapon that targets people who drive like insane maniacs or raging assholes.

  • Apeman42@lemmy.world · 1 year ago

    A reverse microwave. If I can heat a cup of coffee in 30 seconds, I want to chill a beer in 30 seconds.

    • Raiderkev@lemmy.world · 1 year ago

      Ice, water, and salt will do the trick in about 5 minutes. Best you’ll get in this no-reverse-microwave-having-ass world.

      • spader312@lemmy.world · 1 year ago

        A machine that spins your can in ice water exists, and it chills a drink within about a minute.

  • CodeMonkeyUK@lemmy.world · 1 year ago

    A device to allow me to spread cream cheese evenly on a bagel without getting all over my fingers.

    I’m thinking some sort of rotating bagel mount with a silicone tapered spike.

  • MrFunnyMoustache@lemmy.ml · 1 year ago

    I would like a flagship-spec phone (especially the RAM; give me all the RAM possible) with a small screen and massive battery life, sacrificing other components to fit a big battery. One small camera is enough for me; I don’t need 3 cameras on the back of my phone. I would also accept a single small speaker to save internal space, and remove the haptic system entirely in exchange for a larger battery.

    I wouldn’t sacrifice the headphone jack though, I hate using dongles.

      • MrFunnyMoustache@lemmy.ml · 1 year ago

        It’s been over a decade since I had a phone with an SD card slot, so I kinda forgot they exist, but you’re right, this is an extremely useful feature.

  • Dagwood222@lemm.ee · 1 year ago

    Niche, but I want it. It would look like a blank book, with pages that feel like paper. I’d be able to download whatever text I wanted and read it like an old-fashioned book. You’d be able to change the text as many times as you wanted.

    • utopiah@lemmy.ml · 1 year ago

      FWIW I’m using the reMarkable 2. It runs Linux, and the eInk is pretty fast for sketching and writing notes. It’s not paper, but it’s the closest to it I’ve tried so far.

  • j4k3@lemmy.world · 1 year ago (edited)

    • Open source motherboards
    • Open source modems for computers and phones
    • Open source cars
    • GrapheneOS phone with enough RAM to run a decent offline LLM
    • Offline AI privacy/network manager designed to white noise the stalkerware standards of the shitternet with a one click setup
    • Real AI hardware designed for tensor math, using standard DIMM system memory with many slots and buses in parallel, instead of the bleeding-edge monolithic GPU stuff targeting a broad market. The bottleneck in the CPU architecture is the L2-to-L1 cache bus width and transfer rate when massive tensor tables all need to be live at once. System memory is great for its size, but that size is only possible because the memory controller swaps out a relatively small chunk that is actually visible to the CPU. On a GPU, by contrast, there is no memory controller, and the memory size is directly tied to the compute hardware. This is the key difference we need to replicate. What we need is a bunch of small system-memory sticks where the chunk normally visible to the CPU is all that is used, with each stick on its own bus running to the compute hardware. Then older, super-cheap system memory could be paired with ultra-cheap trailing-edge compute hardware to make cheaper AI hardware that could run larger models (at the cost of more power consumption). Currently, GPUs with more than 24 GB of VRAM are pretty much unobtainium; an A6000 with 48 GB will set you back at least $4k. I want to run a 70B or greater model; that would need ~140 GB of VRAM to run super fast on dedicated, optimised hardware. There is already an open-source offline 180B model, and that would need ~360 GB for near-instantaneous response. While super speed with these large models is not needed for basic LLM prompting, it makes a big difference with agents, where the model needs to do a bunch of stuff seamlessly while still appearing to work in real time conversationally.
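The VRAM figures in that last bullet follow from simple bytes-per-parameter arithmetic; a minimal sketch, assuming fp16 weights at 2 bytes per parameter and ignoring KV cache and runtime overhead (so real requirements run somewhat higher):

```python
def weight_memory_gb(params_billion, bytes_per_param=2):
    """Approximate memory (GB) needed just to hold the model weights.

    bytes_per_param: 2 for fp16/bf16, 1 for 8-bit quantization,
    ~0.5 for 4-bit. Ignores KV cache, activations, and overhead.
    """
    # params_billion * 1e9 params * bytes, divided by 1e9 bytes/GB,
    # simplifies to params_billion * bytes_per_param.
    return params_billion * bytes_per_param

print(weight_memory_gb(70))   # 140 -> matches the ~140 GB figure for a 70B model
print(weight_memory_gb(180))  # 360 -> matches the ~360 GB figure for a 180B model
```

With 8-bit or 4-bit quantization the same arithmetic shows why a 70B model becomes borderline feasible on much smaller (or cheaper, slower) memory pools.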

  • dsemy@lemm.ee · 1 year ago

    A modern smartphone (good battery, screen, etc.) running an alternative OS (like Linux or OpenBSD) with the ability to run Android (or iOS) apps I unfortunately need to use.

    I recently switched to a Pixel 7a with GrapheneOS and while it’s nice, I still really hate the locked down nature of Android (and iOS).