During some work with Tess, I noticed that my test instance was running horribly slow. The CPU was spiking, and Postgres was not happy, using pretty much all the available compute.

Investigating, I found the culprit to be some crawler, or possibly a malicious actor, sending a massive number of unscoped requests to /api/v3/comment/list. By “unscoped” I mean requests not limited to a post ID. I’m not sure if this is a bug in Lemmy or if there’s a legit use for fetching comments outside of a post, but I digress; that’s another discussion.

After disallowing unscoped requests to the comment list endpoint (see the mitigation further down), the problem went away.

The kicker seemed to be that this bot / jackass was sorting by “Old” and requesting thousands of pages deep. Assuming the backend turns that into LIMIT/OFFSET pagination, a request for page 16413 at 50 items per page means Postgres has to sort and then throw away over 800,000 comments before returning anything.

Requests looked like this: GET /api/v3/comment/list?limit=50&sort=Old&page=16413

Since I officially shut down Dubvee, I’m not keeping logs as long as I used to, but the other page numbers I saw in the access log were all above 10,000. From the logs I have available, the requests seem to be coming from these 3 IP addresses, but I have insufficient data to confirm that’s all of them (it probably isn’t).

  • 134.19.178.167
  • 213.152.162.5
  • 134.19.179.211

Log Excerpt

Note that I log the query string as well as the URI. I’ve run a custom Nginx setup for so long that I actually don’t recall whether the query string is logged by default. If you’re not logging the query string, you can still look for the 3 (known) IPs above making requests to /api/v3/comment/list and see if entries similar to these show up (a log_format sketch follows the excerpt below).

2025-09-21T14:31:59-04:00 {LB_NAME}: dubvee.org, https, {LB_IP}, 134.19.179.211, - , NL, Amsterdam, North Holland, 52.37590, 4.89750, TLSv1.3, TLS_AES_256_GCM_SHA384, "GET", "/api/v3/comment/list", "limit=50&sort=Old&page=16413"
2025-09-21T14:32:00-04:00 {LB_NAME}: dubvee.org, https, {LB_IP}, 134.19.179.211, - , NL, Amsterdam, North Holland, 52.37590, 4.89750, TLSv1.3, TLS_AES_256_GCM_SHA384, "GET", "/api/v3/comment/list", "limit=50&sort=Old&page=16413"
2025-09-21T14:32:01-04:00 {LB_NAME}: dubvee.org, https, {LB_IP}, 134.19.179.211, - , NL, Amsterdam, North Holland, 52.37590, 4.89750, TLSv1.3, TLS_AES_256_GCM_SHA384, "GET", "/api/v3/comment/list", "limit=50&sort=Old&page=16413"
2025-09-21T14:32:01-04:00 {LB_NAME}: dubvee.org, https, {LB_IP}, 134.19.179.211, - , NL, Amsterdam, North Holland, 52.37590, 4.89750, TLSv1.3, TLS_AES_256_GCM_SHA384, "GET", "/api/v3/comment/list", "limit=50&sort=Old&page=16413"
2025-09-21T14:32:12-04:00 {LB_NAME}: dubvee.org, https, {LB_IP}, 134.19.179.211, - , NL, Amsterdam, North Holland, 52.37590, 4.89750, TLSv1.3, TLS_AES_256_GCM_SHA384, "GET", "/api/v3/comment/list", "limit=50&sort=Old&page=16413"
2025-09-21T14:32:13-04:00 {LB_NAME}: dubvee.org, https, {LB_IP}, 134.19.179.211, - , NL, Amsterdam, North Holland, 52.37590, 4.89750, TLSv1.3, TLS_AES_256_GCM_SHA384, "GET", "/api/v3/comment/list", "limit=50&sort=Old&page=16413"
2025-09-21T14:32:13-04:00 {LB_NAME}: dubvee.org, https, {LB_IP}, 134.19.179.211, - , NL, Amsterdam, North Holland, 52.37590, 4.89750, TLSv1.3, TLS_AES_256_GCM_SHA384, "GET", "/api/v3/comment/list", "limit=50&sort=Old&page=16413"
2025-09-21T14:32:13-04:00 {LB_NAME}: dubvee.org, https, {LB_IP}, 134.19.179.211, - , NL, Amsterdam, North Holland, 52.37590, 4.89750, TLSv1.3, TLS_AES_256_GCM_SHA384, "GET", "/api/v3/comment/list", "limit=50&sort=Old&page=16413"
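
If your format doesn’t capture it, note that the stock “combined” format actually does include the query string, since $request is the full request line; it’s only custom formats built around $uri that need $args added. A minimal sketch of logging the path and query string separately (hypothetical format name and trimmed-down fields, not my exact config):

# $uri is the normalized path; $args is the raw query string
log_format with_args '$remote_addr - [$time_local] '
                     '"$request_method $uri" "$args" $status';

access_log /var/log/nginx/access.log with_args;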

Mitigation

First I blocked the IPs making these requests, but they kept coming back from different ones, so I implemented a more robust solution.

My final mitigation was to simply reject requests to /api/v3/comment/list that did not have a post ID in the query parameters. I did this by creating a dedicated location block in Nginx that is an exact match for /api/v3/comment/list and doing the checks there.

I could probably add another check to see if the page number is beyond a reasonable value (a rough sketch of that follows the config block below), but since I’m not sure what clients, if any, actually use this, I’m content just blocking unscoped comment list requests entirely. If you have more info or a better suggestion, leave it in the comments.

location = /api/v3/comment/list {

  # You'll need the standard proxy_pass headers such as Host, etc. I load those from an include file.
  include conf.d/includes/http/server/location/proxy.conf;

  # Create a variable to hold a 0/1 state
  set $has_post_id 0;

  # If the URL query string contains 'post_id' set the variable to 1
  if ($arg_post_id) {
    set $has_post_id 1;
  }

  # If the variable is not 1 (i.e. does not have post_id in the arguments), return 444
  # 444 is an Nginx-specific return code that immediately closes the connection 
  # and wastes no further resources on the request
  if ($has_post_id != 1) {
    return 444;
  }

  # Otherwise, proxy pass to the API as normal
  # (replace 'lemmy-be' with whatever your upstream name is for the Lemmy API).
  # Note there's no URI part (not even a trailing slash) after the upstream name:
  # with one, Nginx would replace the matched location with it, and the backend
  # would receive "/" instead of "/api/v3/comment/list".
  proxy_pass http://lemmy-be;
}
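
For reference, if you also wanted to cap the page depth, something like this inside the same location block should work (an untested sketch: Nginx’s if can’t do numeric comparisons, so a regex on the digit count stands in for “page 1000 or higher”):

# Hypothetical addition: drop requests for any 4+ digit page number
if ($arg_page ~ "^\d{4,}$") {
  return 444;
}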
  • OpenStars@piefed.social · 5 days ago

    This is for Lemmy I presume (or also for PieFed or Mbin)? You’ve modified yours heavily though, I thought, which could complicate matters. I wonder if you are having those bot-scraping issues that semi-recently (a month or so ago?) started increasing in frequency. So many instances now have a human detector before letting you in, whereas before it was not necessary.

    • freamon@preferred.social · 5 days ago

      PieFed has a similar API endpoint. It used to be scoped, but that was changed at the request of app developers. It’s how people browse sites by ‘New Comments’, and, for a GET request, it’s not really possible to document and validate that an endpoint needs to have at least one of something (i.e. that none of ‘post_id’, ‘user_id’, or ‘community_id’ is individually required, but there needs to be at least one of them).

      It’s unlikely that these crawlers will discover PieFed’s API, but I guess it’s no surprise that they’ve moved on from basic HTML crawling to probing APIs. In the meantime, I’ve added some basic protection to the back-end for anonymous, unscoped requests to PieFed’s endpoint.

      • mapto@feddit.bg · 5 days ago

        Could you elaborate on this:

        it’s not really possible to document and validate that an endpoint needs to have at least one of something

        In what sense is it not possible? I can easily see it being done in the code.

        • freamon@preferred.social · 4 days ago

          It’s straightforward enough in back-end code to just reject a query if parameters are missing, but I don’t think there’s a way to define a schema that then gets used to auto-generate the documentation and validate the requests. If the request isn’t validated, then the back-end never sees it.

          For something like https://freamon.github.io/piefed-api/#/Misc/get_api_alpha_search, the docs show that ‘q’ and ‘type_’ are required, and everything else is optional. The schema definition looks like:

          /api/alpha/search:
              get:
                parameters:
                  - in: query
                    name: q
                    schema:
                      type: string
                    required: true
                  - in: query
                    name: type_
                    schema:
                      type: string
                      enum:
                        - Communities
                        - Posts
                        - Users
                        - Url
                    required: true
                  - in: query
                    name: limit
                    schema:
                      type: integer
                    required: false
          

          required is a simple boolean for each individual field - you can say every field is required, or no fields are required, but I haven’t come across a way to say that at least one field is required.
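
          The closest I’ve seen is collapsing the whole query into a single exploded object parameter and putting an anyOf on its schema, but tooling support for that pattern is patchy enough that I don’t count it. A rough sketch (hypothetical endpoint and parameter names, showing only two of the fields):

          /api/alpha/comment/list:
              get:
                parameters:
                  - in: query
                    name: filters
                    style: form
                    explode: true
                    schema:
                      type: object
                      properties:
                        post_id:
                          type: integer
                        community_id:
                          type: integer
                      anyOf:
                        - required: [post_id]
                        - required: [community_id]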

          • mapto@feddit.bg · 4 days ago

            Ah, I see, so you are talking about this.

            Of course it is nice if things get auto-generated, but doing it yourself, both in code and in documentation, should never be excluded as an option.

    • Admiral Patrick@lemmy.world (OP) · edited · 5 days ago

      Lemmy. I added a comment above since LW wouldn’t let me edit the post.

      Mine’s only extended with some WAF rules and a massive laundry list of bot user agents that it blocks, but otherwise it’s pretty bog-standard.
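
      For the UA blocking, it’s the usual map-based pattern (a minimal sketch; the bot names here are just examples, not my actual list):

      # In the http{} context: flag known bot User-Agents
      map $http_user_agent $blocked_ua {
          default                              0;
          ~*(bytespider|amazonbot|claudebot)   1;
      }

      # Then, in a server or location block:
      # if ($blocked_ua) { return 444; }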

      If instances have Anubis set up correctly (i.e. not in front of /api/...), then it might not help them here, since this traffic hits the API endpoint directly.