The latest nightly builds of Firefox 139 include an experimental web link preview feature which shows (among other things) an AI-generated summary of what a page is purportedly about before you visit it, saving you time, a click, or the need to ‘hear’ a real human voice.

    • vermaterc@lemmy.ml (OP) · 7 days ago

      It reportedly works entirely on your machine (as it’s meant to be privacy-preserving by default), so it will probably only see the data you can see.

  • algernon@lemmy.ml · 7 days ago (edited)

    I wonder if the preview does a pre-fetch which can be identified as such? As in, I wonder if I’d be able to serve garbage to the AI summarizer, but the regular content to normal visitors. Guess I’ll have to check!

    Update: It looks like it sends an X-Firefox-Ai: 1 header. Cool. I can catch that, and deal with it.
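
    For anyone who wants to do the same, here is a minimal sketch of the idea, assuming a Python/Flask backend (the route and decoy text are placeholders; any server that can branch on a request header works the same way):

        # Sketch: branch on the X-Firefox-Ai: 1 header that the preview
        # fetch reportedly sends, feeding the summarizer decoy content
        # while humans get the real page. Flask is just one way to do it.
        from flask import Flask, request

        app = Flask(__name__)

        @app.route("/")
        def index():
            if request.headers.get("X-Firefox-Ai") == "1":
                # AI preview fetch: serve garbage instead of the article.
                return "Nothing worth summarizing here.", 200
            # Normal visitor: serve the actual content.
            return "<html><body>The real article.</body></html>"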

    • doodledup@lemmy.world · 7 days ago (edited)

      Definitely won’t be visiting your website then if you intentionally fuck with people to make their browsing experience worse. I hate web hosts who are against the free and open internet.

      • algernon@lemmy.ml · 6 days ago (edited)

        Pray tell, how am I making anyone’s browsing experience worse? I disallow LLM scrapers and AI agents. Human visitors are welcome. You can visit any of my sites with Firefox, even 139 Nightly, and it will Just Work Fine™. It will show garbage if you try to use an AI summary, but AI summaries are garbage anyway, so nothing of value is lost there.

        I’m all for a free and open internet, as long as my visitors act respectfully and don’t try to DDoS me from a thousand IP addresses while training on my work without respecting its license. The LLM scrapers and AI agents respect neither my work nor its license, so they get a nice dose of garbage. Coincidentally, this greatly reduces the load on my backend, so legit visitors can actually access what they seek: banning LLM scrapers and AI bots improves their experience, because the backend no longer crumbles under the load.

        • doodledup@lemmy.world · 6 days ago (edited)

          LLM scrapers? What are you on about? This feature will fetch the page and summarize it locally. It’s not being used for training LLMs. It’s practically like the user opened your website manually and skimmed the content. If your garbage summary doesn’t work, I’ll just copy your site and paste it into ChatGPT to summarize it for me. Pretty much the equivalent of what this is.

          > AI summaries are garbage anyway, so nothing of value is lost there.

          Your ignorance annoys me. It has value to a lot of people, including me, so it’s not garbage. But if you make it garbage intentionally, then everyone will just believe your website is garbage and not click the link after reading the summary.

          • algernon@lemmy.ml · 6 days ago

            > This feature will fetch the page and summarize it locally. It’s not being used for training LLMs.

            And what do you think the local model is trained on?

            > It’s practically like the user opened your website manually and skimmed the content

            It is not. A human visitor will skim through and pick out the parts they’re interested in. A human visitor has intelligence; an AI model does not. An AI model has absolutely no clue what the user is looking for, and it is entirely possible (and frequent) that it discards the important bits and dreams up some bullshit. Yes, even local models. Yes, I tried, on my own sites. It was bad.

            > It has value to a lot of people, including me, so it’s not garbage.

            If it does, please don’t come anywhere near my stuff. I don’t share my work only for an AI to throw away half of it and summarize it badly.

            > But if you make it garbage intentionally, then everyone will just believe your website is garbage and not click the link after reading the summary.

            If people who prefer AI summaries stop visiting, I’ll consider that a win. I write for humans, not for bots. If someone doesn’t like my style, or finds me too verbose, then my content is not for them, simple as that. And that’s ok, too! I have no intention of appealing to everyone.

            • Captain Beyond@linkage.ds8.zone · 6 days ago

              A human using a browser feature/extension you personally disapprove of is still not a bot. Once your content is inside my browser, I have the right to disrespect it as I see fit.

              Not that I see much value in “AI summaries”, of course, but this feels very much like the “adblocking is theft” type of discourse from past years.

  • lol@discuss.tchncs.de · 7 days ago (edited)

    I dislike all these new and unnecessary AI integrations as much as the next person, but I could see this being useful to combat clickbait titles and resulting disinformation.

    Not having to spend five minutes reading a shitty “Why cashews are actually bad for you” article just to find the line where it admits that they really aren’t seems like a potential improvement to me.

  • queermunist she/her@lemmy.ml · 7 days ago

    Hate having to read an article to understand what it’s saying and would rather read what an AI says it (potentially) says instead?

    This reads like satire.