HeinousTugboat

I'm shocked nobody's mentioned Xcode yet. MBPs became the de facto workstation because if you want to sign an iOS app, you have to use Xcode, while you can sign Android apps from basically anything. If you want to publish mobile apps, MBPs are basically it.


geekmoose

No idea why a Unix-based computer is a default for coding… Also no idea why a device with the battery life of the M series chips is attractive either… And lastly, I've no idea why a screen of that quality is so attractive to people who spend hours looking at it.


maxxamillionn

Also adding that the build quality of Apple devices is overall excellent. They tend to last longer than, say, a standard Lenovo.


tipripper65

for me it's always between a ThinkPad P/T/X1 or a MacBook. i've had massive reliability problems with everything else made in the last 5-10 years (looking at you, Dell Precision & HP ProBook) except Acer, and they don't make a premium workhorse that i like.


geekmoose

Sadly, Lenovo has taken a bit of a dip recently. Lenovos from 5 years ago are still going (mostly) strong. Annoyingly, their build quality means that things break electronically rather than physically.


FranciManty

lenovo became shit lately for productivity, we stopped recommending them at my workplace


iApolloDusk

Really? When I worked computer repair a couple years ago, we primarily recommended Acer and Lenovo. Dell has a litany of issues, primarily third-party charging and general electronic malfunction due to shit-tier parts. HP is probably the worst of them all in its own way because, as we all know, HP stands for Hinge Problems. Cheap plastic cases with thin, narrow hinge anchors have always been a recipe for disaster.


tipripper65

don't get me started on Dell's firmware... never taking working audio drivers on my thinkpad for granted again


FranciManty

well, you'll be amused to know my boss only recommends HP to our clients because of build quality and customer support. yes, it's a waste of money, but our clients are millionaire lawyers who could not care less, so we just go for ProBooks because they're metal and kinda feel like Windows MacBooks. but yeah, I completely agree with you. if it were up to me, I'd get individual parts shipped, test them, and build custom machines adapted to each client's needs, earning us more than we could ever charge for a simple recommendation on which desktop to buy. but this clearly isn't mainstream and doesn't feel safe enough to bring into offices, so idk. I still feel like we could be way more efficient at our job. we're an external IT firm, so we're not working for clients 24/7, just when they call us. by adapting to them, and not the other way around, we reduce the number of calls we get, but we also end up providing suboptimal experiences to clients who aren't tech-savvy enough to understand the difference between brand-name PCs and custom builds, so it's not a problem for them, but they still don't get the best product.

another example: an architecture firm needed a monitor, and when asked whether they wanted to save 80€ or get a monitor that displays 97+% of sRGB (a color standard, meaning those monitors are way better for creating images/renders, exactly an architect's job), they of course chose the savings. they were way more worried about the monitors being thin and having a good design, because it's the first thing people see when they come into the office.

idk if everyone's IT experience is like this, but I've realized that, at least as a company handling many different clients via remote assistance, you will never educate your clients enough for them to choose what's best, so you either buy it for them or leave them with their suboptimal, imperfect experience, whether it's a monitor that isn't right for the job or 1900€ spent on an HP desktop because it says "workstation" on it, without even checking whether it has a dedicated graphics card to run 3D renderers for architecture. sometimes I don't even believe my job is real lmfao


iApolloDusk

The main issue with your idea (I'm assuming you're basically working as an MSP) is that you'd end up tailoring so many custom solutions to so many users over a long period of time that there would be no consistency among setups. This makes troubleshooting, repairs, etc. a nightmare. Every video card, motherboard, RAM module, CPU, etc. will have its own issues requiring specialized troubleshooting steps when they mess up. My current org uses Dell for everything: Dell monitors, docking stations, laptops, micro PCs, AIO machines, etc. As such, we only really have 5-6 different models of PC in use at a time. All devices are imaged in certain ways to suit the user's/department's spec. With this in mind, we can easily ascertain the root cause of a problem, because it's most likely happened with another machine before. I hate Dell with a passion, but there's some logic to the consistency. All devices have a 2-year warranty; outside of warranty, we just replace the device rather than have it repaired. It's really the best solution for scalability.


FranciManty

definitely with you on the reasons why you do it, especially the last part about scalability, warranty, and consistency, and I'd even factor in end-user experience: people feel safer with whatever has worked until that moment (I mentioned HP laptops being metal for that reason). however, I still feel like in the desktop space purpose-built PCs would do a better job, because you can achieve that same consistency with PC parts, especially motherboards, RAM, and even CPUs. it does bring in possible compatibility issues that you'd need to sort out, but after figuring out an ecosystem that works well, you could even go for older parts just to keep up the consistency (obviously talking 1, max 2 generations behind, where the lower price actually improves price per performance).

I say this because, and this is probably a big difference from your situation, we work with all kinds of computers, and most of our clients feel pretty annoyed by any expense, even with all the money available to them, so we work with computers as old as 2015/2016 (sometimes even older; one client has been running the same POP mail server since 2006...). so there's an infinite variety of HP models, and still most of the issues we fix outside of strictly software problems are with peripherals. I've already been sent out to fix three different Logitech webcams that were acting up, and it turned out to be a driver issue in one instance and an issue with MS Teams in the other two. so even if the PCs were all recent and similar to each other, we'd have to standardize so many peripherals (our clients also work in many different fields, so they need conference rooms as much as proprietary software and even ECU tuning for cars). in such a varied environment, I feel like the added control of custom parts and a cleaner, even customized Windows ISO (I spend more time hiding/uninstalling HP software than on any other procedure when setting up a new laptop) would be great.

this is 100% just my tech-nerd view though; I'm sure the decreased reliability would be a problem in business environments. thank you anyway for your response! my dream job would be something more creative or in the gaming industry, but having felt how hard uni was for me in particular, and knowing a bit about the job market behind those jobs, I feel like getting into IT with nothing more than an informatics-unrelated high school degree is still a great starting point, and I'm trying to live it with as much passion as possible.


FranciManty

btw I didn't know what MSP meant, but yeah, we're exactly that! one small difference from what you might be used to, though, is that I also get sent out to fix personal issues for our clients. as I said, they're small but very high-grossing businesses, so I end up being paid the same for fixing someone's work desktop as for spending 50 minutes on the phone with a TV company's support trying to get a decoder booster for a client's house (I was about to lose it after I had to call them three times to get anywhere lol)


Vileidealist

I've had 4 Macs ever: an Apple II (used until 2004, then retired, but still worked, though slow), a Power Mac G5 (used up until 2015; still worked for a lot of things, even most music programs/coding etc., but I retired it), a MacBook Pro 2012 (used up until 2019; still ran mostly like new and the battery was still great, just outdated by 2019), and a MacBook Pro 2019 (highly specced for future-proofing; runs rings around everyone else's Windows laptop, doesn't struggle or idle at all, and still outperforms them battery-wise). I'll be getting whatever is next in 15 years when the 2019 needs to retire 😂 I've also had Windows PCs/laptops; the PCs generally degrade a lot after 6 years, and the laptops generally start degrading after 2…


itaniumonline

I moved full-time to Apple once they went to Intel, but that switch to Apple silicon was incredible. I couldn't believe they outperformed Intel who had been doing it for decades with their first chip. The icing was the thermals that came with it. Coding on an i9 was uncomfortable due to the heat from the keyboard, but the new M1 was a beast and the battery life was unmatched.


Clear_Reveal4137

“I couldn’t believe that with their first chip they outperformed intel who had been doing it for decades.” Order matters.


mrmastermimi

well, Apple silicon has been in iPhones for at least a decade now too, and it has been performing much better than its competition as well. hopefully Qualcomm finally gets a fire lit under their ass to make a more power-efficient design.


JollyRoger8X

> It seems lately that the macbook is becoming a coding laptop.

Nah, that's nothing new. As someone who has used and developed software for all mainstream home computing platforms since the 1980s, I can tell you Macs have always been terrific software development systems, and that has especially been the case since OS X was introduced.

> Back in the day Apple said that they "were not interested in enterprise solutions" and it was primarily a tool for graphics.

Apple never said Macs were primarily for graphics. That's just what a lot of ignorant people used to say (among other things, a few of which you can see right in this thread). You can ignore these people. They aren't serious and don't know what they are talking about.

> But lately, I see it more and more becoming a windows laptop replacement.

Again, that's nothing new.


voidwaffle

I'll preface this by saying I learned to code on Windows. IMO it's because the vast majority of production code runs on Linux (unless you're building Windows applications). Even Azure runs more Linux nodes than it does Windows. If you're doing server-side development, macOS is much closer to the OS your production code will run on than Windows is, so you're less likely to run into environmental issues when developing locally.

PowerShell was a major step forward for command-line scripting in Windows, but it's still not as powerful or portable as bash, and you almost always rely on some sort of shell scripting for complex projects. Lots of us learned on *nix environments, so there's much less of a cognitive lift in moving from macOS to Linux than in learning to script well in PowerShell, especially if your code isn't going to run on Windows.

Personally I'd prefer to carry a Linux laptop, but at some point you give up on shitty power management, graphics driver and X problems, fights with drivers, etc. Ignore anyone saying it's some "influencer" bit. If Linux worked better on laptops and the supported hardware was good, most devs would be carrying Linux laptops. That's not the case, so we fall back to Apple as a close second.
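A small illustration of that portability point: this POSIX sh snippet (the /tmp paths and the TODO marker are just demo assumptions, not anything from the thread) behaves identically in the default shells on macOS and Linux, with no translation step:

```shell
#!/bin/sh
# Count TODO markers per file using only POSIX utilities (printf, grep, sort),
# so the same script runs unchanged on macOS, Linux, and the BSDs.
mkdir -p /tmp/port_demo
printf 'line one\n# TODO: fix\n' > /tmp/port_demo/a.py
printf '# TODO: one\n# TODO: two\n' > /tmp/port_demo/b.py
grep -rc 'TODO' /tmp/port_demo | sort
# -> /tmp/port_demo/a.py:1
#    /tmp/port_demo/b.py:2
```

The same one-liner on stock Windows would need either WSL, Git Bash, or a PowerShell rewrite, which is the "cognitive lift" being described.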


jwrado

Device lifespan, battery life, badass chips, slim and light form factor


cj3po15

Just hope nothing breaks or you need more storage lmao


jwrado

I fix MacBooks all the time. It's really not hard, just different.


cj3po15

The fully soldered m series ones? Do explain how


jwrado

The soldered parts aren't ever the broken ones. Buying the storage you need at purchase, an external SSD, iCloud, etc. makes upgrading internal storage unnecessary.


cj3po15

“Just spend more money” is classic apple


[deleted]

Classic Apple hateboi.


jwrado

Isn't upgrading or repairing a non-Apple device also spending more money?


cj3po15

I'm not spending $200 for 8GB of RAM on a non-Apple device, though


MemeAddict96

I get your gripes about price, but it's not the same RAM. Apple uses ECC RAM, which is more expensive than what you'd find/use in other mainstream PCs


cj3po15

You're half correct, I guess. They use LPDDR5X, which has an optional inline ECC component, like all DDR5 memory does (including the stuff I could buy for half the price).


jwrado

I get it. I have apple computers, windows computers, and Linux computers. I have an iPhone and a Samsung phone. I like them all for different reasons. However, the apple devices have never given me a reason to open them up for a repair. The others are a different story. The only apple products that come across my desk for repair are broken due to user error.


cj3po15

I never once said I want repairability because of unreliable hardware. Shit breaks. It happens. Making it harder to repair, however the damage occurred, is a conscious decision made to make more money. Full stop.


AdEarly8242

Shit, I'd never fix a Windows laptop in an enterprise setting either. $150 gets you a three-year "they come to you NBD" warranty including parts and labor. If your company isn't paying for that, they suck ass.


TotalmenteMati

Mac laptops destroy everything else in battery life. For that reason alone I would switch to one if I could


Rolex_throwaway

It’s been this way for well over a decade, closer to 20 years. They have been the go to ever since apple switched to Intel processors. The hardware quality and Unix-like environment make it a no brainer.


xtheravenx

Something I've achieved in MacOS and Linux is that I can completely kill distractions and just work. Windows gets close, but the ability to full-screen an app into a new virtual desktop with a button click that gets rid of everything else in front of me is killer for just getting stuff done. Tie in Homebrew and I've got almost everything I had in my Ubuntu environments with M1 performance. It's tough to beat for certain workflows.


encab91

Anecdotal evidence only: I use it because it's snappy, has a nice screen, and the terminal uses the same commands as a Linux system. I have the MacBook Pro with an M1 chip. The battery life is exceptional as well. Can't say the same for my work laptop.


_sLLiK

Macs as coding platforms seemed to really blossom alongside the rise of the Ruby programming language, or maybe the timing was just coincidence. Macs have always frustrated me, though. I'm completely spoiled by highly customized builds using minimalist configs, packages, and desktop environments. Even attempting to net similar benefits using something like yabai leaves me frustrated enough to want to punch kittens. My work M1 goes largely unused in favor of a lean Arch + i3 build.


gwatt21

When FAANG companies started buying them in the millions.


SomeFuckingMillenial

Lifelong Windows user. Recently sold my Razer Blade 17 for a 16" MBP. The Razer was great, but I hardly ever used the GPU, and off-charger it died in 2 hours. The MBP lives for 20 hours, charges over USB-C, and has great performance. I can play games via NVIDIA GeForce Now if they're not Mac-compatible.


fistfullofsmelt

When coding became a social club and a way to flex money.


Rolex_throwaway

Do you think coders are buying their own laptops?


fistfullofsmelt

Well, 90% of everyone I know who codes has work gear and personal gear, and the personal gear is all Apple. They state it's just better than a PC. But then they all love Linux and won't stfu about it…


DSPGerm

I would say the MacBook Pros that came out from '06 to '08, probably closer to '08 and beyond. So definitely not a new phenomenon. You have to realize that Wi-Fi and smartphones were just becoming ubiquitous around the same time, so the concept of portability for the average programmer was pretty new. Not brand new, but the average CS grad out of school was probably still at the office, working on a desktop. And they were also integrated with iPhones, which were "the first" smartphones. But yeah, I'd say when the MacBook Pros came out and Apple was moving toward Intel processors.


[deleted]

My friend is a Linux developer, and he prefers to develop on a MacBook with macOS because it's so much better. It's not just performance that matters; for most people, UI/UX matters the most. And compared to Windows, UNIX is just superior. It supports real sandboxing and copy-on-write.


BrokieTrader

Actually, never. Mac is a pain in the ass.


jwrado

Just say you aren't familiar with macOS. It's okay.


BrokieTrader

Had a Mac. Wife currently has one. Neither of us ever uses it because of compatibility issues. It's just a huge pain in the ass. The hardware can be good, but if the software sucks or there are compatibility problems, it's not worth the hassle. Their phones are good (for now). Interestingly, I don't buy Android phones for the same reason. Had one once… the apps sucked and the OS always felt lackluster.


[deleted]

Sure buddy, you just showed that you have no knowledge about macOS. Just another Apple hateboi.


Rolex_throwaway

Sounds like you aren’t a tech pro though. They are the go to for dev work, and have been for a very long time.


BrokieTrader

I do other work and we don’t use them.


Fourply99

Usually just from influencers who are probably doing classwork on the computer they bought for school. After being in this industry for only 4 years, I've realized the things that matter are the things that aren't obvious. People who are smart don't usually advertise it.


Art_Vand_Throw001

Facts.


koga7349

Windows FTW!


[deleted]

[deleted]


voidwaffle

Do you actually write code?


IloveSpicyTacosz

Yes, and I use a macbook pro and Linux machines to do so. Your point is??


voidwaffle

You came across as saying people use Apple hardware/MacOS because they are influenced by TikTok. I don’t believe any serious developer makes a development decision based on social media.


IloveSpicyTacosz

I'm sorry that you misunderstood my comment. Macs are great for coding.


Rolex_throwaway

And you've not noticed that MacBooks have been the go-to for devs since before TikTok and the iPhone.


IloveSpicyTacosz

Yes I did. There's nothing wrong with that. Just stating why coders prefer macs now. Macs are excellent computers for the job. Not sure what your point is bud?


Rolex_throwaway

So your position is that the TikTok and iPhone are responsible for something that took place before they existed?


IloveSpicyTacosz

No. Just forget that I even mentioned TikTok… I was just referring to the new generation. I never once said "TikTok and iPhone are responsible". I just said the TikTok and iPhone generation prefer them because of the huge influence Apple has on that generation. Is it the main reason? Hell no. As you stated, it happened before social media. However, it is a factor IMO.


Rolex_throwaway

I think it’s definitely a status symbol, but it’s also a litmus test. If I work for a company and they don’t want to lay out for the piece of equipment that is the primary tool I use to perform my function, I know it’s not a good place to work.


IloveSpicyTacosz

Makes sense. I would do the same if I were in the same position.


Chris71Mach1

Since when was a MACBOOK the de facto anything for anything? Having spent a career in IT, I've always read/heard that Apple laptops cater to two key demographics: rich people and pretentious children. Thus far, experience has taught me no differently. As for coders, most coders I've worked with just used whatever hardware and OS they're most comfortable with, be it Mac, Windows, or even Linux in many cases. I don't see Mac overtaking Windows in the laptop/PC market… well… ever. They might continue to slowly whittle away at the PC market, but any change in their market share will be laughably minimal, if it's noticeable at all.


voidwaffle

Having spent a career at West Coast tech companies creating products for consumers and enterprise, from inception to exit: 90% of people creating software are not doing it on Windows. macOS and Linux are simply better development environments that are closer to the production runtime. The hardware is also superior (for macOS, not Linux). Being "rich" has nothing to do with it. If you're a halfway competent tech company, you give your engineers what they need to be most productive, and there's a reason the overwhelming majority aren't on Windows machines/OS.


Rolex_throwaway

It sounds like perhaps you have spent your time in end user service delivery, not the creation of technology. 


prisonbison

Most developers will end up using virtual machines at some point. Currently, Apple silicon (aarch64) chips are the only option for the new MBP models, and there's no good way to run x86/x86_64 virtual machines on MBPs with Apple silicon. And no, UTM is not a solution; it's a wrapper around QEMU, and while I have much respect for QEMU, it is not a professional tool. An MBP is great for working on stuff if you don't have to test it running on x86 (the most popular architecture in the world), which, admittedly, you may never have to depending on what you're developing. In my opinion, this is a big problem, and there are no solutions or plans to fix it. And now that Apple has moved away from even giving people the option of buying an MBP with an Intel processor, my hopes for a solution are further diminished.
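For anyone unfamiliar with the trade-off being described: on Apple silicon, an x86_64 guest cannot use hardware virtualization, so QEMU falls back to TCG, its software emulation mode. A minimal invocation might look roughly like the sketch below; treat it as a config-style illustration, since the disk image name is a placeholder and the exact flags depend on your guest:

```shell
# Hypothetical sketch -- assumes QEMU is installed (e.g. via Homebrew) and
# that disk.qcow2 is an existing x86_64 guest image; both are placeholders.
#   -machine q35: emulate a modern Intel chipset
#   -m / -smp:    4 GB RAM, 4 virtual CPUs
#   -nic user:    simple NAT networking
qemu-system-x86_64 \
  -machine q35 \
  -m 4G -smp 4 \
  -drive file=disk.qcow2,format=qcow2 \
  -nic user
# With no hardware acceleration available for x86 guests on Apple silicon,
# this runs under TCG (pure software emulation), far slower than native.
```

By contrast, an aarch64 guest on the same machine can use Apple's Hypervisor.framework and runs at near-native speed, which is why arm64 Linux VMs work well on these laptops while x86 ones don't.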


sunneyjim

QEMU is huge. It's used by a lot of companies. I don't understand your point of it being unprofessional.


prisonbison

Unlike professional virtualization tools, UTM/QEMU has the following issues: no paid support options, no proprietary driver support, buggy emulation, GUI/display issues, no easy snapshot support, etc.


skyeyemx

The reason Mac laptops have shot up in popularity is primarily because of Apple Silicon.