Unable to reply
-
Continuing the discussion from /t/1000, the very slow race for 1005 title edits previously finished, it's off by one.:
It appears this thread has been blamed for some of the recent episodes of server-cooties, and as such it's been put 'on the naughty step' by being placed into a read-only category until the scheduled work on improving large topics happens. (No, I've no idea when that will be.)
Sadly, until it does happen, this topic will have to remain read-only.
Should placing this topic in a read-only category prove insufficient, then it shall be moved to a staff-only area where it will remain invisible to most until stuff happens.
It was suggested that a new thread be started and the posts subsequently moved back into this thread after the remedial work was carried out, but a few quick tests suggest that might screw with post numbering/post count/cursor.
Doing this seems to be the best way of handling this at the moment.
Tried to reply to invoke the "4 days later" mark but found no reply button. Will threads now be automatically locked after some days of inactivity?
-
No, it was locked manually by @PJH due to problems it may (or may not) be causing to Discourse. One may, I assume, "Reply as linked Topic," but not reply within the topic. I haven't tried, but I don't think you can even like existing posts. (Knowing Discourse, it will probably let you do so, and let you think you've succeeded, but not really. However, as I said, I haven't tried.)
-
I can successfully like the 10000th post there, though. So I guess the "like" button still works there.
-
Yeah, seems liking, reading, etc. work; just no posting :(
-
it's been put 'on the naughty step' by being placed into a read-only category
The answer's in the OP. Weird that liking works, though.
-
Weird that liking works, though.
Discourse. Where read-only actually only means read-only some of the time.
-
So I'm away for a few days and you guys manage to kill /t/1000?
I'm wondering if Discourse will be around when I'm back ...
-
I'm wondering if Discourse will be around when I'm back ...
Maybe it'll be gone and @ben_lubar will be working on how to import T-1000 into some other software.
-
-
due to problems it may (or may not) be causing to Discourse
But wasn't testing whether a thread like that would cause issues with Discourse the main purpose of that thread?
-
Sure. Let's say, for the sake of argument, that /t/1000 really did uncover a problem with Discourse and it was killing performance. During the time between the problem being found and the problem being fixed, all of WTDWTF suffers. Better to temporarily suspend the likes thread and make the rest of the forums usable.
At one point, I argued that it was better to let the forums suffer to put more pressure to get the bugs fixed, but that position proved unpopular.
-
-
I think it's a good time to finally join the like thread. Does anyone have a good script they want to share?
-
1. Read post.
2. If joke, laugh.
3. Click on like.
4. Scroll until at next post.
5. GOTO 1
Doing it any other way is cheating both yourself and others.
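The steps above, transcribed literally as a sketch (every name here is a hypothetical placeholder; nothing below is a real Discourse or sockbot API):

```javascript
// Tongue-in-cheek transcription of the five steps above. All helpers are
// hypothetical placeholders, not anything from Discourse or sockbot.
function likeThread(posts) {
  let liked = 0;
  for (const post of posts) { // 1. read post / 4. scroll to next post
    if (post.isJoke) laugh(); // 2. if joke, laugh
    post.liked = true;        // 3. click on like
    liked++;
  }                           // 5. GOTO 1 (the loop goes back to step 1)
  return liked;
}

function laugh() { /* ha. */ }

console.log(likeThread([{ isJoke: true }, { isJoke: false }])); // 2
```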
-
Do you really expect me to click through all those posts?
-
Yes. Time's a wastin'.
-
I'll pay you five bucks if you do it for me.
-
-
Wouldn't it be funny if post #18 of the likes thread was empty/special in some way?
-
-
Addendum: @tar feature request: /t/3004 is automatically reserved by the system for a likes thread
Addendum 2: Import /t/1000 into t3k4 for ~~science~~ ~~raisins~~ ~~prophet~~ profit
Addendum 3: Strikethrough makes ~~e~~ and ~~c~~ ambiguous
-
-
Aye. All 1px of it :P
-
That's all you need
-
Let's say, for the sake of argument, that /t/1000 really did uncover a problem with Discourse and it was killing performance.
I think we should accept that there are serious discoproblems with large topics. I've pasted the top two URLs by server load. Note that even with /t/1000 inactive it's still #1. IIRC, the second line is what records all of the read times for everyone on the site. Less server time with 25 times the number of hits.
```
Top 30 urls by Server Load

Url                                                       Duration   Reqs
---                                                       --------   ----
GET /t/1000.json?include_raw=1&track_visit=true HTTP/1.1   8820.65   3104
POST /topics/timings HTTP/1.1                              7980.85  80608
```
Hmm...that looks different than other topics. I don't see the query string stuff on other URLs. @PJH, maybe something fishy / abusive going on? Anybody recognize that as part of a Like script, maybe?
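Assuming the Duration column is total seconds of server time and Reqs is the request count (a guess from the table above), the per-request cost works out roughly like this:

```javascript
// Back-of-the-envelope check of the numbers in the table above. Assumption:
// Duration is total server seconds and Reqs is the request count.
const rows = [
  { url: "GET /t/1000.json?include_raw=1&track_visit=true", duration: 8820.65, reqs: 3104 },
  { url: "POST /topics/timings", duration: 7980.85, reqs: 80608 },
];

for (const { url, duration, reqs } of rows) {
  // average server time per request, in seconds
  console.log(`${url}: ${(duration / reqs).toFixed(3)}s/request`);
}
// /t/1000.json averages ~2.84s per request; /topics/timings ~0.10s,
// despite roughly 26x as many hits.
```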
-
-
/t/1000.json?include_raw=1&track_visit=true
oh brother.......
~~that's the line sockbot uses to get all the post ids for a topic when reading whole topics.~~ oh. no it isn't....
~~that particular one is fetched only once per topic, then sockbot uses the returned list of post ids to get the rest in batches of 200~~ Yes, but that's not a sockbot url
Although the URL you post is a bit odd...
~~the first call is to~~ The call you posted is
~~/t/1000.json?include_raw=1&track_visit=true&post_ids=0~~
~~and subsequent calls (to get all posts) take the form of~~
~~/t/1000.json?include_raw=1&track_visit=true&post_ids[]=1&post_ids[]=2&...~~
/t/1000.json?include_raw=1&track_visit=true
but the one sockbot uses is:
/t/1000/posts.json?include_raw=1&track_visit=true&post_ids=0
is it stripping those post_ids parameters? or is it perhaps a different bot? (0.xx uses a different form that does not pass include_raw)
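A sketch of the "batches of 200" fetch pattern described above (the URL shape is copied from the examples in this post; buildBatchUrls itself is illustrative, not actual sockbot code):

```javascript
// Sketch of the batched-fetch pattern described above: the id list comes back
// from the post_ids=0 call, then follow-up calls carry up to 200 post_ids[]
// entries each. buildBatchUrls is illustrative, not sockbot's real code.
function buildBatchUrls(topicId, postIds, batchSize = 200) {
  const urls = [];
  for (let i = 0; i < postIds.length; i += batchSize) {
    const params = postIds
      .slice(i, i + batchSize)
      .map((id) => `post_ids[]=${id}`)
      .join("&");
    urls.push(`/t/${topicId}/posts.json?include_raw=1&track_visit=true&${params}`);
  }
  return urls;
}

// 450 post ids -> three requests: 200 + 200 + 50 ids
const ids = Array.from({ length: 450 }, (_, n) => n + 1);
console.log(buildBatchUrls(1000, ids).length); // 3
```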
-
is it stripping those post_ids parameter? or is it perhaps a different bot?
No clue. That's all the info I have. Perhaps someone with more access can figure out the IP address(es) associated with those.
-
No clue.
it's got to be a different bot, if it's a bot at all.
I just checked sockbot@legacy and it also uses the
/t/topic_id/posts.json
url form:
so if it's a bot it's not a sockbot, or it's a sockbot that someone either messed with or never updated. (i'm not bothering to check history to see if sockbot ever used that form of url, it's on github if you want to be bored looking through past revisions)
-
so if it's a bot it's not a sockbot, or it's a sockbot that someone either messed with or never updated. (i'm not bothering to check history to see if sockbot ever used that form of url, it's on github if you want to be bored looking through past revisions)
Discosearch....
https://what.thedailywtf.com/t/sockbot-rolling-dice-for-fun-and-not-profit-since-2014/3591/1184
-
that link is so going to break if that post gets rebaked.
also, that url sockbot uses is of a different form:
t/topicid/last.json?include_raw=1&track_visit=true
That's not a URL 2.0 uses at all.
-
It just came up in Discosearch, wasn't inspecting it in detail.
I wonder if it's the script ~~@ChaosTheEternal~~ someone did which loaded a list of posts and liked them, but I can't find a copy of it. Or maybe the reading script someone else had.
-
This discussion seems to imply that it's not the large topic causing the issues, and it's not the bots causing the issues, but the bots and auto-likers partying on the large topic causing the issues.
-
There are other issues that happen just from users using the large topics. Those are just fairly minimal for now with our large topic temporarily boarded up.
-
the script ~~@ChaosTheEternal~~ someone did which loaded a list of posts

The only scripts I've written are:
- my auto-like script (which only fiddles with the UI, so any loading is from Discourse, not the script);
- a quick one when I was trying to duplicate the multiple likes bug on a single post (which didn't get any list; it only did a POST targeted at post_actions, and only with a fixed post ID).
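For illustration, that fixed-post-ID POST might have been assembled something like this (buildLikeRequest is hypothetical; post_action_type_id 2 being "like" is an assumption to verify against your Discourse version):

```javascript
// Hypothetical reconstruction of the fixed-post-ID POST mentioned above.
// buildLikeRequest only assembles the request; post_action_type_id 2 being
// "like" is an assumption to check against your Discourse version.
function buildLikeRequest(postId) {
  return {
    method: "POST",
    path: "/post_actions",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: `id=${postId}&post_action_type_id=2`,
  };
}

console.log(buildLikeRequest(42).body); // id=42&post_action_type_id=2
```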
-
Yeah, I realised that, hence the edit.
Left the mention so you weren't confused about any notifications.
-
Gotta keep my post count up, gotta keep that TL3.
Though I do have to say, in my Discosearches, the only references I can find to that track_visit parameter seem to be in reference to the Sockbot reader module, unless someone else had to implement it as well in their own custom script, but I don't know anyone else who has one (besides maybe @darkmatter, and all I can find about his auto-reader is that it works like my auto-like script and automates the UI).
-
There used to be a number of other bot implementations, but I thought sockbot subsumed all of them. Might be one of those?
-
No clue. That's all the info I have. Perhaps someone with more access can figure out the IP address(es) associated with those.
A selection:
```
root@what:/var/discourse/shared/standalone/log/var-log/nginx# grep "include_raw=1&track_visit=true" access.log | head
[26/Aug/2015:12:32:15 +0000] XXX.XXX.XXX.XXX "GET /t/50762.json?include_raw=1&track_visit=true HTTP/1.1" "SockBot/2.10.2 (Bewitching Burlap; owner:accalia; user:sockbot)" "topics/show" 200 27567 "-" 0.365 0.365 "sockbot"
[26/Aug/2015:12:33:12 +0000] XXX.XXX.XXX.XXX "GET /t/50762.json?include_raw=1&track_visit=true HTTP/1.1" "SockBot/2.10.2 (Bewitching Burlap; owner:accalia; user:sockbot)" "topics/show" 200 30782 "-" 0.471 0.471 "sockbot"
[26/Aug/2015:13:34:51 +0000] XXX.XXX.XXX.XXX "GET /t/1000.json?include_raw=1&track_visit=true HTTP/1.1" "SockBot/2.10.2 (Bewitching Burlap; owner:accalia; user:Zoidberg)" "topics/show" 200 1324897 "-" 3.046 3.065 "Zoidberg"
[26/Aug/2015:13:34:57 +0000] XXX.XXX.XXX.XXX "GET /t/1000.json?include_raw=1&track_visit=true HTTP/1.1" "SockBot/2.10.2 (Bewitching Burlap; owner:accalia; user:Zoidberg)" "topics/show" 200 1324658 "-" 2.538 2.566 "Zoidberg"
[26/Aug/2015:13:34:57 +0000] XXX.XXX.XXX.XXX "GET /t/1000.json?include_raw=1&track_visit=true HTTP/1.1" "SockBot/2.10.2 (Bewitching Burlap; owner:accalia; user:Zoidberg)" "topics/show" 200 1324986 "-" 2.868 2.868 "Zoidberg"
[26/Aug/2015:13:34:57 +0000] XXX.XXX.XXX.XXX "GET /t/1000.json?include_raw=1&track_visit=true HTTP/1.1" "SockBot/2.10.2 (Bewitching Burlap; owner:accalia; user:Zoidberg)" "topics/show" 200 1324650 "-" 2.988 3.102 "Zoidberg"
[26/Aug/2015:13:34:57 +0000] XXX.XXX.XXX.XXX "GET /t/1000.json?include_raw=1&track_visit=true HTTP/1.1" "SockBot/2.10.2 (Bewitching Burlap; owner:accalia; user:Zoidberg)" "topics/show" 200 1324689 "-" 2.817 3.050 "Zoidberg"
[26/Aug/2015:13:35:01 +0000] XXX.XXX.XXX.XXX "GET /t/1000.json?include_raw=1&track_visit=true HTTP/1.1" "SockBot/2.10.2 (Bewitching Burlap; owner:accalia; user:sockbot)" "topics/show" 200 1324798 "-" 4.195 4.195 "sockbot"
[26/Aug/2015:13:35:02 +0000] XXX.XXX.XXX.XXX "GET /t/1000.json?include_raw=1&track_visit=true HTTP/1.1" "SockBot/2.10.2 (Bewitching Burlap; owner:accalia; user:Zoidberg)" "topics/show" 200 1324753 "-" 4.826 4.826 "Zoidberg"
[26/Aug/2015:13:35:04 +0000] XXX.XXX.XXX.XXX "GET /t/1000.json?include_raw=1&track_visit=true HTTP/1.1" "SockBot/2.10.2 (Bewitching Burlap; owner:accalia; user:Zoidberg)" "topics/show" 200 1324658 "-" 3.739 3.761 "Zoidberg"
```
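If anyone wants to tally those entries per bot account, a quick sketch (logUser is hypothetical; it just grabs the last quoted field of a line in the format above):

```javascript
// Hypothetical helper for tallying the log excerpt above: each entry ends
// with a quoted username, so grab the last quoted field on the line.
function logUser(line) {
  const fields = line.match(/"([^"]*)"/g); // all quoted fields, in order
  return fields ? fields[fields.length - 1].replace(/"/g, "") : null;
}

const sample =
  '[26/Aug/2015:13:34:51 +0000] XXX.XXX.XXX.XXX "GET /t/1000.json?include_raw=1&track_visit=true HTTP/1.1" ' +
  '"SockBot/2.10.2 (Bewitching Burlap; owner:accalia; user:Zoidberg)" "topics/show" 200 1324897 "-" 3.046 3.065 "Zoidberg"';
console.log(logUser(sample)); // prints Zoidberg
```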
-
-
huh......
okay then....
where is that coming from then?
/me wanders off to trawl through sockbot code
-
Just noticed, not sure why @Zoidberg is still interested in /t/1000...
-
Hooray! People are paying attention to me!
-
... and not for nice reasons...
-
Just noticed, not sure why @Zoidberg is still interested in /t/1000...
he's still set to autolike /t/1k, but not to binge. with no new posts that shouldn't be it.... well, it shouldn't be.
could be the autoreader.
looking into it.
-
Well, it dropped off of yesterday's activity.
-
ah. found where that call is coming from
https://github.com/SockDrawer/SockBot/blob/master/lib/browser.js#L558
Looks like it's pulling the topic for messages...
probably because the topic is still getting likes...
hmm.... i'll need to change that, i see.
I'll open a couple of issues that should remedy the load a fair bit.
-
Why hasn't /t/1000 been moved somewhere inaccessible to non-staff?
-
Why hasn't /t/1000 been moved somewhere inaccessible to non-staff?
Because I didn't want to arbitrarily hide it (since some users would not have seen messages posted since they'd last been here).
I did leave that open as an option, however, if simply making it 'read only' didn't help with the problems that were perceived to be attributed to it.
-
Well, if @accalia can fix the "SockBot asks for the entire content of all 50k posts in the thread in each request" thing, that should help with a bit of it.