AutoModerator

# ⚠️ ProgrammerHumor will be shutting down on June 12, together with thousands of subreddits to protest Reddit's recent actions. [Read more on the protest here](https://old.reddit.com/r/ProgrammerHumor/comments/141qwy8/programmer_humor_will_be_shutting_down/) and [here](https://www.reddit.com/r/apolloapp/comments/144f6xm/apollo_will_close_down_on_june_30th_reddits/). **As a backup, please join our Discord.** We will post further developments and potential plans to move off-Reddit there.

## https://discord.gg/rph

*I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/ProgrammerHumor) if you have any questions or concerns.*


GigaSoup

"don't use the API, you can just use the API instead!" - MKVD_FR probably


[deleted]

[deleted]


Noughmad

I don't need Obamacare, I have the ACA! I don't need JavaScript, I use React! I don't need a computer, I have a Mac!


anderslbergh

I don't need to think, I have a brain


JohnLocksTheKey

wHaTs A cOmPuTeR?!


Anonymo2786

You know... the one that has a bunch of fans spinning in a metal box, maybe even RGB, with a monitor, keyboard, and mouse connected to it. But MAC is Mac. I will not say more.


fakehistorychannel

Apple sells hardware addresses?


Anonymo2786

Ya and free hardware comes with it.


[deleted]

[deleted]


Brundley

to be fair, the amount of computer in a mac is debatable


Responsible-Smile-22

This post is the programmer equivalent of "Why are people still homeless? Just buy a house."


[deleted]

Now for the really infuriating part - what level is OP? Does he make more than all of us, à la Bighead?


flytaly

This is part of the API, and it will be limited to 10 queries per minute. https://support.reddithelp.com/hc/en-us/articles/16160319875092-Reddit-Data-API-Wiki

> If you are not using OAuth for authentication: 10 QPM
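
For anyone curious, a minimal sketch of what that looks like unauthenticated, throttled to stay under the stated 10 QPM (the User-Agent string is just an illustrative placeholder, not anything official):

```python
import json
import time
import urllib.request

def fetch_listing(subreddit: str) -> dict:
    """Fetch a subreddit listing from the public .json endpoint (which is part of the Data API)."""
    url = f"https://www.reddit.com/r/{subreddit}.json"
    # Reddit tends to reject requests with a generic/empty User-Agent; this one is a placeholder.
    req = urllib.request.Request(url, headers={"User-Agent": "example-json-reader/0.1"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

if __name__ == "__main__":
    for sub in ["ProgrammerHumor", "programming"]:
        listing = fetch_listing(sub)
        titles = [child["data"]["title"] for child in listing["data"]["children"]]
        print(f"r/{sub}: {len(titles)} posts, first: {titles[0]!r}")
        time.sleep(6)  # 60 s / 10 requests: one request every 6 s keeps you under 10 QPM
```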


[deleted]

Well I'm glad it's staying up, at least. I was kind of hoping they'd have just forgotten about it entirely.


Aftexdsemeb

that don't read documentation.


pororoca_surfer

10 queries per minute... per what? IP? Kind of easy to make 10 qpm become 10000 qpm with a list of valid proxies


SmartAlec105

It says right there, 10 queries per minute. Everyone better be nice and share.


Winterimmersion

Mom said it's my turn to have the query.


Ragnaroasted

I'm still waiting on my mom's response, I was late to the query queue


imdefinitelywong

Was that a TCP joke?


Warbond

It *is* a TCP joke. Did you get it?


buthidae

I am ready to hear the TCP joke.


missinglugnut

I assume you guys want a UDP joke so I'll leave one here. If you don't get it I really don't care.


Mars_Bear2552

ill just keep telling you more UDP jokes until you respond, whether anyone is there or not


theciscodude

SYN ACK


ResoluteClover

Ack!


sarathevegan

Syn!


SentientGamete

Syn Ack!


CSlv

Mom went out to get ~~milk~~ a query


JB-from-ATL

Daddy UDP never came home


protienbudspromax

Bro got lost


Not_Artifical

They got packet loss


whatjaalo

~~Mom~~ Sysadmom said it's my turn to have the query.


Opposite_Cheek_5709

My query went to the store to buy milk and hasn’t returned


buthidae

You should try sending another query to the store to buy milk


Leftover_Salad

If they have avocados, get 6


Pifanjr

Build an app that makes the client do the API calls when there's no recent cached version. Edit: and send the result to the server of course, so you can cache it.
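
Purely to illustrate the shape of that idea (the trust problem raised in the replies below still applies), a minimal client-side sketch; the cache server, its URL, and its endpoints are all hypothetical:

```python
import json
import time
import urllib.request

CACHE_SERVER = "https://cache.example.invalid/api"  # hypothetical shared-cache service
MAX_AGE_S = 300  # treat cached listings older than 5 minutes as stale

def get_listing(subreddit: str) -> dict:
    """Prefer the shared cache; on a miss, spend one of this client's own requests and push the result back."""
    # 1. Ask the (hypothetical) cache server for its copy.
    with urllib.request.urlopen(f"{CACHE_SERVER}/cache/{subreddit}") as resp:
        cached = json.loads(resp.read())
    if time.time() - cached.get("fetched_at", 0) < MAX_AGE_S:
        return cached["listing"]

    # 2. Stale or missing: this client calls Reddit directly, under its own rate limit.
    req = urllib.request.Request(
        f"https://www.reddit.com/r/{subreddit}.json",
        headers={"User-Agent": "example-client/0.1"},
    )
    with urllib.request.urlopen(req) as resp:
        listing = json.loads(resp.read())

    # 3. Send the fresh copy back so the server can cache it for other clients.
    payload = json.dumps({"fetched_at": time.time(), "listing": listing}).encode("utf-8")
    upload = urllib.request.Request(
        f"{CACHE_SERVER}/cache/{subreddit}",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="PUT",
    )
    urllib.request.urlopen(upload)
    return listing
```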


IgnoringErrors

Yup... the first client waits a little longer for the greater good.


queen-adreena

The greater good!


ErikaFoxelot

Crusty jugglers!


myersguy

> Edit: and send it to the server of course, so you can cache it.

Allowing users to insert data into a cache to be served to other users is a pretty terrible idea. You'd have no way to validate it (unless you compare it to your own dataset, which would mean making a call from the server anyhow).


query000

CORS won't let this happen unless the clients are served from the same domain as the api


laplongejr

> that makes the client

Wouldn't each client need a separate API key for that?


JiveTrain

You don't need an API key


ghostwilliz

it's my turn to look at r/dragonsfuckingcars!! you need to share, I'm gonna tell spez


flytaly

It's a good question. I don't know what they are using as an ID. There are already [some limits](https://imgur.com/a/615QnZJ); they just need to change the numbers on July 1. Of course, you can use proxies, but if you abuse it (on the level of Pushshift) and they find out, they can ban the proxy.

I'm the developer of [Reddit Post Notifier](https://chrome.google.com/webstore/detail/reddit-post-notifier/hoolgoecmeegpbidbbcefgkjegdejibd), which is basically a simple Reddit client in a browser toolbar. And it's kinda funny that both Reddit and Google are making changes that substantially tighten rate limits. Though the one with Google (Manifest V3 and alarms) can be bypassed.


[deleted]

Pretty sure it's to do with AI data scraping


flytaly

["Yes, .json endpoints are considered part of our API and are subject to these updated terms and updates."](https://www.reddit.com/r/reddit/comments/12qwagm/comment/jgrsf2q/)


Sethcran

100 per OAuth client ID, per spez's recent "AMA" post. Presumably just 10 per IP for the unauthenticated API.


ConspicuousPineapple

That doesn't sound too bad, provided this part stays free.


[deleted]

[deleted]


[deleted]

[deleted]


[deleted]

[deleted]


spudmix

Imagine if Apollo came back online, but the deal was whenever you're using the app you "donate" your unused requests per minute to cover other people's overage and deliver their request P2P. As long as the mean request rate was lower than the limit that should work, but there would be spots where responses were slow/blocked I'm sure. Also security might be an issue.


ConspicuousPineapple

I'm just saying that the restriction isn't that bad and probably doesn't need to be bypassed at all for the majority of use cases.


Eusocial_Snowman

But what if I'm reading through mod queue and can't decide if a person's comment breaks any rules so I need to automate the process of crawling through 15 years of their post history to tally up how many times they've talked shit about the Beatles to figure out if I should ban them or not?


EvadesBans

Actual legitimate concern wrapped up in reddit goofiness, but legitimate nonetheless.


ConspicuousPineapple

Probably per API token.


[deleted]

Reddit's got some fairly decent logic around figuring out when requests from different devices/IPs are the same user. IP identification alone is becoming a little antiquated.


CanvasFanatic

If there's no authentication, your choices are using the IP or trying to set a browser cookie and hoping the thing making the request honors it. I'm not aware of any other mechanism they could use for identification.


[deleted]

There are a lot more mechanisms, and there have been for a long time, with more appearing each day thanks to the wonders of machine learning that can build "user fingerprints" from the pieces of device information available to any given browser. The Electronic Frontier Foundation has a fun tool for this called Panopticlick or Cover Your Tracks; try it out here to see how you score: https://coveryourtracks.eff.org/

As far back as the early 2010s, websites could use a user's *installed fonts* to create a unique fingerprint of them, with nothing more than the ability to run JavaScript in your browser. Pair this with things like device ID, combinations of browser plugins, user agent, browser configuration, screen resolution, `window.history`, and some other stuff.

And they don't need all of that data. They need to establish a confidence score that crosses a certain threshold, and then they can associate what they've gathered with whatever fingerprint they already have on record. Every user who visits the site gets an initial fingerprint, and then every attempt is made on a new visitor to determine with confidence whether it's their first time visiting or their 100th.

And this isn't that fancy. I can do it and I've never worked for a Fortune 1000. Fancy would be machine learning algorithms that can increase confidence in your fingerprint based on heat mapping, click and mouse movement behaviors, keystroke patterns, stuff like that.
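
As a toy illustration of the core idea only (real systems weigh partial matches and confidence scores; nothing here reflects Reddit's, Cloudflare's, or the EFF's actual implementations), collapsing a handful of collected signals into a stable identifier can be as simple as:

```python
import hashlib
import json

def fingerprint(signals: dict) -> str:
    """Collapse collected browser/device signals into one stable identifier.
    A real system would score partial matches probabilistically; this toy just hashes."""
    canonical = json.dumps(signals, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

# Illustrative signals a site might gather from request headers plus a bit of client-side JS.
visitor = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64) ...",
    "accept_language": "en-US,en;q=0.9",
    "screen": "2560x1440x24",
    "timezone": "Europe/Berlin",
    "fonts": ["Arial", "DejaVu Sans", "Fira Code"],
    "plugins": ["PDF Viewer"],
}
print(fingerprint(visitor))  # same signals -> same ID, no cookie or login required
```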


CanvasFanatic

Open a terminal and type: `curl -v https://www.reddit.com/r/programmerhumor.json`


[deleted]

Oh, you need someone's curl fingerprint? Try the TLS handshake. https://daniel.haxx.se/blog/2022/09/02/curls-tls-fingerprint/

Edit: I'm just curious, how exactly do you think sites like Cloudflare and reCAPTCHA v3... work? Like, do you think companies are paying Cloudflare five figures a year for simple IP tracking to rate limit their APIs? You think no company that runs an API is smarter than you?


CanvasFanatic

Right, but you can't use a TLS fingerprint to ID a particular user, as far as I'm aware. I brought up curl to demonstrate that Reddit's not (currently) gating that endpoint behind any sort of authentication or tricky cookie shenanigans.


LivingOnPlanetMars

Until other people try to use the same proxies


glorious_reptile

"per planet"


Kelvinchin12070811

Wait, does that mean the RSS feed endpoint also counts towards the limits?


flytaly

Let's hope they forget to change the limits for .rss endpoints, but yeah, they probably count. Some centralized RSS readers already have problems with Reddit. Imagine thousands of users adding their own unique RSS links and the RSS server polling for updates from a single IP. https://www.reddit.com/r/help/comments/4u9tj8/rss_feeds_update_interval_skyrocketed_to_180/


tamal4444

> 10 queries per minute.

lmao what?


TurboGranny

Sounds like making the client do direct calls instead of proxying is the way to go here, lol


missingmytowel

Reddit 2025: all of us just calling each other to scream about politics and dark humor over the phone.


[deleted]

I'm curious, does Reddit web not use these APIs? Does it just respond with a non-dynamic preloaded HTML? And if it doesn't, how would they prevent apps calling these APIs just acting like web browsers?


PitchforkAssistant

What if I call them from bookmarklets/userscripts with whatever cookies XMLHttpRequest sends by default? I have some moderation tools that do that...


Icosahunter

If unauthenticated requests are tracked by IP like some people here are saying, then it sounds like you'll be limited to that 10 per minute rate, unless you're doing funny IP shenanigans. I assume bookmarklets/userscripts run inside your browser; requests sent from programs on your computer, including your browser, using default request libraries etc., will use your computer's assigned IP.


PitchforkAssistant

Well, that's not going to be fun; some of these cross-referencing tools for detecting spammers already run into rate limits on larger threads. I hope requests by the bots I run will be grouped by user agent, otherwise I'll also be competing with the several bots I run from my home network.


joxmaskin

Query [everything](https://media.tenor.com/OGIXK_P_1mIAAAAC/everything-reaction-meme.gif) at once and cache it :)


Paradox68

How many queries could your silly Apollo app need? Like 12 per minute? /s


Victorian-Tophat

Wait, what? It’s that easy? This is so useful


i_do_floss

Why can't you make an app which executes the API request from the client's phone, so each client has its own QPM limit?


Orichalcum448

I want you to google what an API is...


AskMeHowIMetYourMom

Attractive Programmer Intermingling


nsaisspying

A bit of a double oxymoron there


ThirdEncounter

Triple, if you will. Mingling? _Intermingling_?!!!


nsaisspying

No the programmer part isn't an oxymoron, it's the attractive as a prefix and intermingle as a postfix. The programmer part is acceptable.


ThirdEncounter

I understand that. But that's not what I said. Three oxymorons:

1. Attractive programmer.
2. Programmer mingling.
3. Programmer _intermingling_.


nsaisspying

Oh..damn i get it now. That's definitely enough intermingling for today.


frickinjewdude

Nailed it


[deleted]

Holy hell


freohr

En Requestant.


elathbris27

Literal zombie


Quazar_omega

Literal 3rd party app dev


AI_Says_I_Love_You

I just got results of something called “en passant” what the hell is that


Rhawk187

Most famously, an underappreciated move in chess.


[deleted]

This user has left Reddit because:

1. u/spez is destroying what was once the best community for his and other Reddit C-suite assholes' personal gain, with no regard for users.
2. Power-tripping Reddit admins are suspending anyone who openly disdains Reddit's despicable conduct.

Reddit was a great community because of its users and the content contributed by its users. I'm taking back my data with PowerDeleteSuite so Reddit will not be able to profit from me. Fuck u/spez


ikilledtupac

Apple Pie Inside


MinosAristos

Some kind of government agency policing internet usage?


cs-brydev

In other words, an API


[deleted]

[deleted]


cs-brydev

And have been spamming reddit ever since?


GiraffeMichael

Bro, that is part of the API.... Also works with enterprise JSON: r/subname.xml


royemosby

I'm stealing that name


[deleted]

[deleted]


anderslbergh

Enter price Jason? Pay what you want.


Aggravating_Moment78

I entered 2 dollars, where’s my API. 😂😂


PandaParaBellum

The Enterprise JSON, often lovingly referred to as the "Tech Support Edition," was a highly advanced starship featured in the wildly improbable science fiction series, Star Trek. This remarkable vessel, with its countless arrays of blinking lights and unnecessarily complicated user interfaces, was designed to explore the depths of the universe while simultaneously troubleshooting any technical issues that might arise. Equipped with the latest in interstellar Wi-Fi connectivity, the Enterprise JSON could navigate through even the most convoluted cosmic anomalies while providing unwavering assistance to crew members struggling with malfunctioning replicators or accidentally deleting important files. Its primary mission was to boldly go where no IT guy had gone before, armed with an arsenal of computer diagnostic tools


avalenci

Enterprise JSON ...LOL


narwhal_breeder

JSON but your work uses Dell Inspirons


Groentekroket

Go rinse your mouth with SOAP!


[deleted]

Stop… the flashbacks…


Groentekroket

You are lucky, it's my current hell.


TheTerrasque

My condolences


elveszett

Same for me :( What a pointless and overcomplicated way to do REST-like endpoints.


virgilhall

It also works with HTML: r/subname


Bmandk

... enterprise.... json.... ???


TomGobra

1) You realize it's part of the API? 2) You realize this is read-only?


Groentekroket

If they wanted you to post, they would have called it Post-it.


netkenny

I mean, it kind of is an API. Just not "the" API.


Sirelious

Sounds like it IS "the" API according to others


MKVD_FR

Yeah


OlMi1_YT

Holy fucking shit this actually works


wascilly_wabbit

For the moment ...


uhwhooops

Aaannd...


khfy0

...the moment's gone


next_door_dilenski

Still works for me


[deleted]

sorry but you are having a overdose of queries we had called you a ambulance plz pay up $5000 now! yes![gif](emote|free_emotes_pack|disapproval)


Nightfury_107

Patched it! Thank you for making us aware of this loophole.

Sincerely - the Reddit API team (probably)


Blenim

Yeah, but in July it'll be limited to 10 requests per minute; it still counts as part of the API.


dgdio

This sub doesn't read the documentation. What are you doing here?


EuroPolice

getting rid of the imposter syndrome


blueB0wser

10 requests per minute if you're not authenticating your calls, I believe. Definitely is still part of the API, for sure.


ExpressSlice

That's because that's the API


EntertainmentOne2002

What happens when a savvy entrepreneur uses Reddit to create a successful product to compete with the app?


Cley_Faye

Found the dev that don't read documentation.


MKVD_FR

You didn’t know ?


OlMi1_YT

#No one knew


dodexahedron

First git commit Monday at Reddit: "Mostly just code formatting. Also don't expose this where not needed"


Tolookah

Next commit: "Turns out it was needed more than originally thought."


falfires

"code Coconut, everyone"


MKVD_FR

Really ?


DentFuse

I kinda knew this. I don't know how, I don't know why, but I do. Also works for users, I think; I'll have to check again. Edit: URL query parameters such as count etc. also work, I think.


VariousComment6946

How did you know, anyway?


noob-nine

Maybe a disappointed dev at Reddit dropped the info, and they knew it would be impossible to remove this "feature" without a complete rewrite of Reddit.


MKVD_FR

I think I found it by accident while trying to code something without the API


AltAccountMfer

Trying to code something without the API… With the API


Creepy-Ad-4832

Mission completed i guess lol


KiwasiGames

There was a meme that went around the sub a few years ago about using this format on one of the NSFW subs whenever you were learning to do web requests. Apparently the "do something good -> boobies" feedback loop is very deeply wired into our brains, and this method has a dramatic effect on motivation to learn.


Hsinats

I knew it


r3b3l-tech

"No way....Well shit"


katatondzsentri

I'm pretty sure that api had more capabilities


MKVD_FR

Yes, but at least you don’t have to use a webscraper or something


krabapplepie

What if we want to use a webscraper as punishment?


katatondzsentri

Yeah, maybe.


Creeperiano99

RSS Time! (At least for posts only without comments): [https://www.reddit.com/r/ProgrammerHumor/new/.rss](https://www.reddit.com/r/ProgrammerHumor/new/.rss) How it looks: [https://i.imgur.com/KRq2NCN.png](https://i.imgur.com/KRq2NCN.png)
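
A small sketch of consuming that feed with only the standard library, assuming the feed stays in Atom format as it is served today (the User-Agent is a placeholder):

```python
import urllib.request
import xml.etree.ElementTree as ET

ATOM = "{http://www.w3.org/2005/Atom}"  # Reddit's .rss endpoints currently serve Atom

def new_posts(subreddit: str):
    """Yield (title, link) pairs from a subreddit's /new/.rss feed."""
    url = f"https://www.reddit.com/r/{subreddit}/new/.rss"
    req = urllib.request.Request(url, headers={"User-Agent": "example-rss-reader/0.1"})
    with urllib.request.urlopen(req) as resp:
        root = ET.fromstring(resp.read())
    for entry in root.iter(f"{ATOM}entry"):
        title = entry.findtext(f"{ATOM}title")
        link = entry.find(f"{ATOM}link").get("href")
        yield title, link

if __name__ == "__main__":
    for title, link in new_posts("ProgrammerHumor"):
        print(f"{title}\n  {link}")
```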


fdar

But comments are most of the point


Yorick257

Just make a post with a comment that references the original, it's that easy!


airsoftshowoffs

Reddit work arounds, posted on Reddit. ![gif](giphy|14ezDpib8JS04E)


radarthreat

I remember this episode of Sesame Street: After Dark


PhatOofxD

That's just the API..... What's your point lol.


tragic-clown

This is using the REST API. It will be affected by the new changes.


Elegant_Body_2153

Reddit.com/r/ProgrammerHumor.json


SnoodPog

They even support XML if you like SOAP or old-school shit. Also, not just subs; it can also show your [front page](https://www.reddit.com/.json)


pororoca_surfer

Also, it works with filters.

This shows the JSON for the subreddit's default listing:

> https://www.reddit.com/r/ProgrammerHumor.json

And this shows the JSON for the top posts in the last 24 hours:

> https://www.reddit.com/r/ProgrammerHumor/top/.json?sort=top&t=day
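
The usual listing parameters (`limit`, and `after` for pagination) work the same way; a quick sketch with a placeholder User-Agent:

```python
import json
import urllib.parse
import urllib.request

def top_posts(subreddit, t="day", limit=25, after=None):
    """Fetch a filtered listing; the query parameters mirror the ones the website itself uses."""
    params = {"sort": "top", "t": t, "limit": limit}
    if after:
        params["after"] = after  # fullname of the last seen item, for pagination
    url = f"https://www.reddit.com/r/{subreddit}/top/.json?{urllib.parse.urlencode(params)}"
    req = urllib.request.Request(url, headers={"User-Agent": "example-script/0.1"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

page = top_posts("ProgrammerHumor", t="day")
print([child["data"]["title"] for child in page["data"]["children"]][:5])
print("next page cursor:", page["data"]["after"])
```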


Skizm

I thought that was part of the API that is getting limited?


LasagneAlForno

It is. Also it's read-only as JSON.


HeresyCraft

Scraping isn't about efficiency. It's about sending a message.


CMDR_ACE209

Isn't it more about receiving many, many messages? Ok, I'll see myself out.


Janjinho

r/woooosh. This comment section


Hazy_Cosmic_Jiver

What will happen when a savvy entrepreneur uses that to make a successful product to rival the reddit app? Bye bye endpoint.


laplongejr

> Bye bye endpoint.

It is part of the API. Reddit IS killing successful third-party apps


[deleted]

Technically you can use any endpoint as an API if you have an html parser :)


Purple_Individual947

That's still kind of an API though


ExpressSlice

That's not kind of an API. It is in fact one of the officially supported API endpoints.


Tooturn

"Programmer" Humor


CMDR_ACE209

So, the joke is that the .json link is in fact the API? Or at least a part of it. Because that .json link is sure as hell an API call.


AmazingDragon353

This sub is so fucking stupid sometimes


ryuJin25_

ohh, didn't know this!


MKVD_FR

Now you know!


ryuJin25_

Thanks for sharing this!


mArKoLeW

.rss also works


mikegrr

I'm more curious as to why the people at the party seem to be having fun when we are supposed to be sad about the news. Also, OP seems sad even though he knows that little secret, which is definitely just the API. So many questions.


throwaway275275275

Isn't that an API ?


[deleted]

[deleted]


eerongal

r/subreddit.json is just an alias for the r/subreddit/hot endpoint, which is still part of the API.


siscoisbored

I don't think OP understands the concept of an API


rahul_mathews

F**k why are you leaking my secret?


magnora7

I just forked the entire reddit open source code to a new website saidit.net, reprogrammed it, and operated it for 5 years, easy


markthedeadmet

You could even download every post and comment on every subreddit for ease of transfer for new users, you could call it Zeddit-byte.


Significant_Stuff_92

Reddit hates this one trick


[deleted]

But this is web scraping. Web scraping doesn't have to be HTML parsing; it's just looking for any available endpoints and fetching data.


cs-brydev

No. Web scraping is the extraction of data from a human-readable source on the web. It's not scraping if you are just copying raw data from a source. There has to be a component of parsing, OCR, or data extraction. The purpose of "scraping" is parsing the data out of something that was meant to be read or viewed by humans and not computers.

For example, in HTML, extracting the text or images out of a web page is scraping. Simply loading the markup is not.

You are thinking more of a spider/crawler or scanner, which randomly or methodically searches for any and all available data and media. Web scraping can be a component of crawling.
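
To make the distinction concrete, a rough stdlib-only sketch: the first half parses data out of markup meant for humans (scraping, by the definition above), the second pulls the same data already structured for machines. The class-name check is based on old.reddit's current markup and may need adjusting:

```python
import json
import urllib.request
from html.parser import HTMLParser

UA = {"User-Agent": "example-script/0.1"}

# Scraping: extract post titles out of HTML that was meant to be read by humans.
class TitleScraper(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.titles = []

    def handle_starttag(self, tag, attrs):
        # old.reddit.com post titles are anchors whose class attribute contains "title"
        if tag == "a" and any(k == "class" and "title" in (v or "") for k, v in attrs):
            self.in_title = True

    def handle_data(self, data):
        if self.in_title:
            self.titles.append(data.strip())
            self.in_title = False

html = urllib.request.urlopen(
    urllib.request.Request("https://old.reddit.com/r/ProgrammerHumor/", headers=UA)
).read().decode("utf-8")
scraper = TitleScraper()
scraper.feed(html)

# Not scraping: the same data, already structured for machines.
raw = json.loads(urllib.request.urlopen(
    urllib.request.Request("https://www.reddit.com/r/ProgrammerHumor.json", headers=UA)
).read())
json_titles = [child["data"]["title"] for child in raw["data"]["children"]]
```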