>>> "the realization that Apple no longer caters equally to casual and professional customers as it had in the past [YouTube video]. Instead, the company appears to be following an iOS-focused, margin-driven strategy that essentially relegates professionals to a fringe group."
Except this is largely based on mythology. People seem to equate "pro user" with "software developer" (and no other roles/occupations at all), and invent a fictitious history in which the MacBook Pro was jumping up and down on stage shouting "DEVELOPERS! DEVELOPERS! DEVELOPERS!"
Except that never happened. The perceived dev-friendliness of the MacBook Pro was a historical accident: it turns out that when you build something for use cases like audio/video engineering (which overlap a bit in their hardware needs with compiling software), and, through quirks of your company's history, happen to ship a Unix-y operating system, developers will like it. But aside from providing tools to build applications for Apple's own platforms, the MBP and other Mac hardware was never deliberately tailored to developers.
Meanwhile, the audio/video-engineer types still seem to like the MBP; the reviews I've read from them are positive about the touch bar and accepting of the fact that USB C is the future. It's just the developers -- and largely developers who hated Apple's products anyway and likely will never interact with the new MBP -- who say that this is proof that Apple has completely abandoned "pro" users.
>>> "Meanwhile, the audio/video-engineer types still seem to like the MBP"
I can't speak to the professional video world, but that's absolutely false in the professional audio world.
My experience is in the world of composition and sound design; the attitude towards using Mac for that shifted a while back, due to the lackluster and non-existent upgrades to the Mac Pro.
No one I know in the professional audio world, nor anything I have read, suggests they like the new laptop (which wouldn't get much use from them anyway).
>I can't speak to the professional video world, but that's absolutely false in the professional audio world.
Depends on what you mean by "professional audio world". If you mean heavy old-style studios, the kind Led Zeppelin might have recorded in back in the day, yes, but those are on the way out (profit- and usage-wise) anyway, as the industry shifts.
Musicians, producers, DJs, etc. mostly carry laptops and have home studios based around them, and most use MacBook Pros for their stuff (as is evident in all kinds of interviews and live scenarios).
I was referring to the composition and post production (sound design, engineering) world. Mostly because that is the industry I worked in for many years and still continue to work in (much more sporadically these days).
That's a world that was dominated by Mac Pros around 2010. Around 2013 I started seeing a shift, myself included, to custom built PC workstations and that trend is just increasing now. The initial switch, I believe, started with the lackluster cylinder Mac Pro, but continued due to the obvious failings in the Mac desktop market.
You speak of home-based studios using MacBook Pros, but anyone doing that is obviously not a professional. I will grant you that many DJs are using MacBooks for their mobile rigs, but at home, anyone actually doing professional audio work is likely using a massively powerful workstation or a number of PCs (master/slaves).
This notion that MacBooks (or even laptops in general) are super popular in the professional audio world is fiction made for advertising.
>You speak of home based studios using MacBook pros, but anyone doing that is obviously not a professional.
Tons of musicians/producers/etc have those, while making more money than professional studios from their productions -- and not just in EDM.
Lots of the work that studios did for even superstar musicians (pop, etc), nowadays happens in the box, and not just demos and early sketches.
>This notion that MacBooks (or even laptops in general) are super popular in the professional audio world is fiction made for advertising.
Rather the professional audio world is not what it used to be.
I'd consider a million-making Björk or whatever working on their laptop as equally (or more) professional than some struggling studio or post-processing facility.
That's more a case of the MacBook Pro being "good enough", and of Apple (and PC manufacturers too, for that matter) having successfully segmented the market (with things like ports). Even Macworld will tell you these days that "whether you choose a Mac or PC for music production is largely down to the platform you prefer and who you’re collaborating with. There’s little inherent advantage to using Macs, beyond familiarity with the system, and the general robustness of the hardware" [1].
What would be the alternative anyway? Imagine a DJ playing a set and suddenly having Windows force an update on them. Linux during some periods barely played audio for regular use cases.
If you want good hardware and a reliable operating system, you go with a Mac. A few annoyances with adapters it might require won't change that at all.
Anecdotally, the surprising answer might be rather old BeOS-based systems. At least that's what I saw once, and apparently some people still use old audio tools that are BeOS-only (from talking to the DJ) :D
I got the strong impression that those self-proclaimed "professionals" are more or less exactly the opposite: people who spend an inordinate amount of time plugging in dozens of different peripherals with random port requirements, fiddling around with the hardware, and running benchmarks.
For what it's worth, a Mac today is still as useful (at least for web development) as it has always been. Much more so, actually, because Homebrew has evolved to be completely stable.
And while CPU speeds have been stagnating (mostly because they've reached "good enough" and then some), people are ignoring two major shifts in hardware that have significantly improved the dev experience: Retina screens and SSDs.
The absence of an SD card slot and the general lack of peripheral ports is universally lamented by AV professionals. There are also reports of poor battery life, although I suspect that is a side effect of a brighter backlight.
No, that is not mythology: Apple specifically targeted developers in the past, such as in this 2002 commercial featuring a stereotypical grey-bearded UNIX guru:
Quote: "It's not just for artists. It's not just for pointy-headed intellectuals from California. It's for us, it's for programmers, it's for developers."
That was then. But now that Apple does know that the MBP is perceived as dev-friendly, based on the type of people in that market segment that buy them, why would they let that slice of the market go?
True. But they do seem to have created a niche in the market for someone else to fill, due also to their high markups. Most "developers" don't actually need Macs, many of them just haven't realized it yet.
The perception, as displayed in this thread, is that they are. Plus the overwhelming negativity from developers regarding the new MBP, as seen in numerous HN threads and other tech news stories.
I don't really have any stock in whether the perception is a reality or not, so I'm willing to be persuaded. Hopefully in the next iteration the MBP will have at least 32GB RAM, and they will make a move that will please the software developer market.
>I don't really have any stock in whether the perception is a reality or not, so I'm willing to be persuaded. Hopefully in the next iteration the MBP will have at least 32GB RAM,
The people lamenting the missing 32GB RAM option as devastating for pros never had it in the first place, as the previous generation didn't support 32GB either (and yet those machines still worked just fine).
Second, the limitation to 16GB comes from Intel, not Apple. Intel didn't have 32GB-compatible modules with low energy consumption ready -- and without those, Apple using the energy-sucking 32GB option would have reduced battery life by 20% or more.
Not to mention that the overwhelming majority of PC laptops developers buy don't have 32GB at this point in time either, including the most famous models (XPS etc).
Well, the people lamenting the lack of the 32GB option bought a 16GB machine a while ago and are looking to upgrade.
Apple could have kept the previous form factor and not made the battery 20% smaller, which would have allowed them to offer the energy-sucking 32GB option. Plus the new form factor is useless anyway, since we end up having to carry adapters which take as much space and weight as what was saved.
Anyway, I've adopted a wait-and-see approach and will reserve judgement until next year, when the next iteration of machines comes out. Maybe I'll be lucky and Apple will even decide to bring back the 17-inch MacBook Pro.
>Apple could have kept the previous form factor and not made the battery 20% smaller which would have allowed them to have an energy sucking 32GB option.
More people like thinness though, than 32GB RAM (which would have been a built-to-order option that few would have clicked). Remember, Apple has those numbers too.
To quote a programmer on thinness:
"I have to admit being a bit baffled by how nobody else seems to have done what Apple did with the Macbook Air – even several years after the first release, the other notebook vendors continue to push those ugly and clunky things. Yes, there are vendors that have tried to emulate it, but usually pretty badly. I don’t think I’m unusual in preferring my laptop to be thin and light. (...) I’m personally just hoping that I’m ahead of the curve in my strict requirement for “small and silent”. It’s not just laptops, btw – Intel sometimes gives me pre-release hardware, and the people inside Intel I work with have learnt that being whisper-quiet is one of my primary requirements for desktops too. I am sometimes surprised at what leaf-blowers some people seem to put up with under their desks. (...) I want my office to be quiet. The loudest thing in the room – by far – should be the occasional purring of the cat. And when I travel, I want to travel light. A notebook that weighs more than a kilo is simply not a good thing (yeah, I’m using the smaller 11″ macbook air, and I think weight could still be improved on, but at least it’s very close to the magical 1kg limit)"
IIRC, Intel shipped custom CPUs to Apple for the MacBook Air. Once those CPUs became more generally available, other brands followed suit quite quickly.
This has been a repeating pattern with Apple since the iPod success fattened their wallet. They would routinely single out parts they wanted, and then basically order the factory capacity for that year in one go.
Palm/HP execs lamented this fact when they unveiled the Pre 3 and Touchpad, as they often wanted to use the same parts but found Apple had gotten there before them.
A rev of the original MacBook Air (the 2008-2010 one) got a custom package of an Intel CPU before it was available commercially.
The original MBA was expensive and underpowered, a niche product at the time. PC vendors responded with equally niche designs (remember the Dell Adamo?)
The Late 2010 MacBook Air introduced the current case design at a reasonable price, used 2+ year old internals (Core 2 Duo + 330m graphics), and set the world on fire.
Well, considering it became a top seller, earned huge profits, was mimicked by almost every other manufacturer, and Intel even gave money to OEMs to make competitive PC laptops, I don't see the sarcasm as warranted.
>Palm/HP execs lamented this fact when they unveiled the Pre 3 and Touchpad, as they often wanted to use the same parts but found Apple had gotten there before them.
Isn't that how economies of scale and bulk ordering are supposed to work?
> I don't really have any stock in whether the perception is a reality or not, so I'm willing to be persuaded.
Apple said it was their best laptop launch by sales volume ever. (Although that is partially due to the long wait since their previous model.) Honestly, I'm not that surprised: despite all the hate online, I know 2-3 developers who bought them on or near launch day, and several others who started salivating when mine arrived. The public internet opinion seems shockingly disconnected from that of everyone I've spoken to in person about it. Although, doing mostly web development, more than 16 gigs just isn't important for me or my coworkers. My last laptop had 8 and it was more than fine.
Most people are much more curious about the touchbar (it's a bit gimmicky), the screen (gorgeous, but no touch-based scroll in browsers sucks) and the price (yowch!). I get a few comments both ways about the new charger (oh cool, no more proprietary connectors!) (whaaat, no MagSafe? Boo!).
Oh, and being a Gen1 Apple product there are all sorts of dumb stability problems -- which is a much bigger problem than the RAM! My touchbar has been glitching out sometimes, and the whole system has hard-crashed half a dozen times since I got it, from connecting and disconnecting my 4K monitor. I'll be very disappointed if the software doesn't improve over the next few months.
The constant crashing, touchbar bugs (a few times it didn't even turn on!), graphical corruption, obvious cheapskating (no extension cable, non-data charging USB cable, bad keyboard backlight, no more cable holders on power brick and more) are the larger disappointments with this machine :/
Given those issues, I'm wondering if returning the laptop for a new one might be in order? For what it's worth, I've been using the new MacBook Pro with the touch bar for a few weeks now, and have only experienced one hard freeze.
Contrast that with the steady stream of issues I experienced with a similarly specced high-end Dell XPS 15" -- power cord whine, a wobbly and noisy fan on the graphics card, severe overheating to the point that the BIOS triggered shutdowns, repeated keys (an ongoing issue affecting multiple Dell models, based on support forums), suspend/resume failures, a laggy touchpad, and numerous issues with WiFi and sound across several distributions of Linux -- the new MacBook Pro has been a delight. To Dell's credit, they've released BIOS updates on a regular basis, but the laptop is still plagued with annoying issues, even on Windows.
What I don't get from the article: if I'm pissed about the hardware change, why am I trying to run Linux on it / my old one?
A) You keep your old laptop and continue to run macOS for as long as possible
B) You're shopping for a new laptop and don't buy Apple, then run Linux
If you keep your old hardware, why run Linux on it? I've been looking at who else out there is making decent laptops that run Linux well (open to suggestions, especially with more than 16GB RAM), as I have a feeling my 2015 Retina will be my last MBP.
The hard part for me is that I've gotten used to a certain look and feel with the MBP. That nice solid, rigid body, and the display hinge that just stays. When I pick up a ThinkPad and it creaks and flexes when I open the display or casually pick it up by the corner it makes my skin crawl.
Have you taken a look at the Razer Blade? It is basically equivalent to the MacBook Pro's hardware in every way, except that it's black, has an i7, a touchscreen, and an NVIDIA GeForce GTX 1060, and is a bit less expensive. I just got one. I haven't put Linux on it yet, but I'm pretty blown away by the hardware.
Razer's basically a "gamer gear" company. Think stylized light-up keyboards, mice, headsets, and so forth. Problem is, their quality control is really all over the place and they are ridiculously expensive. This weird QC is part of the reason I never bothered with their laptops despite the well specced hardware.
There was an article last week stating they don't support Linux out of the box (or at least not Ubuntu), but I don't remember details regarding how much work is required to get it running.
The Arch wiki has a whole load of details on getting one up and running[1]. These instructions will usually carry over to any other (up-to-date) distro with a bit of adaptation.
Their older models from 2015 or so with 970M GPUs won't boot consistently due to the hybrid (Optimus) graphics. The situation may have improved with the newer models, though, and their Stealth line only has Intel graphics, which should work fine.
I cannot recommend Lenovo X1 Carbons enough. I use it at work. Runs Linux buttery smooth. (Tested Ubuntu and Arch, both work perfectly, Arch needs a few minor tweaks)
Apart from being wrong, that narrative doesn't even make sense as an intro to this article: the criticism was directed at the new MBP hardware – so why would these super-professional uber-users buy it and then replace the OS?
(Not to mention that, if I understood it correctly, Linux currently doesn't even support the keyboard, and is otherwise probably in its usual desktop mode of "WLAN, sound, or functioning sleep – choose any two".)
Author here. As explained in the intro, folks like antirez are considering moving back to Linux and in his linked HN comment he specifically talks about trying Linux on the MacBook he's currently using to see where he's headed. The article seeks to give these folks in particular a helping hand in understanding what works and what doesn't:
That said, the kernel developers working on Mac hardware support aren't Apple haters. We're fans. I also don't fully share the "not for professionals" criticism that was widely directed at the new MacBook Pro: e.g. as stated in the article, the four Thunderbolt ports are interesting for HPC applications, as you can build a portable compute cluster with up to five fully-meshed high-speed networked nodes. If that's not professional, I don't know what is.
It's experimental though. Better back up your data.
I'd recommend creating a partition for ZFS and moving data there that you want to use on both OSes, or outright installing Linux on a zpool and mounting that on macOS. This has worked remarkably well for me, even with lz4 compression and all the other great features. Layer ZFS on TrueCrypt if you need encryption; I wrote a howto a while ago but never had the time to update it:
Maybe, but not all pros are computer-science pros. Photographers and videographers, for example, are pretty pissed that the SD card reader was removed, as far as I know... but then again, that's solvable; the RAM limit far less so (3D artists might care?)
That's right. But it seemed for a while that the MacBook Pro had the feature set, hardware configurations and broad appeal that allowed both groups to be aligned on their hardware choices (i.e. a MacBook Pro).
It is perhaps too early to wave goodbye once and for all to that coalition, but it does seem like some of the design directions that Apple has taken have drifted away from that user base.
That said, this war on the professional and creative class of users began some time ago. I think the introduction of Final Cut Pro X was the first salvo: while the platform has significantly evolved from the first couple of problematic releases, I personally know some who considered the product a literal betrayal and abandoned ship for Premiere Pro or Avid.
I sense Microsoft is attempting to reassemble that coalition with their offerings, namely the Surface line (especially that gorgeous-looking Surface Studio) and the introduction of Bash on Windows, which, while still quite limited in its functionality, can be seen as a step.
I really like my MBP 2013 (NVIDIA GPU) and I guess I am going to keep it a while longer.
> E.g. as stated in the article, the four Thunderbolt ports are interesting for HPC applications as you can build a portable compute cluster with up to five fully-meshed high-speed networked nodes. If that's not professional I don't know what is.
Can you elaborate on what you mean by that? Are you implying a MacBook cluster is an effective HPC platform?
Let's say you're a penetration tester, you're on-site trying to break into a network, have no Internet connectivity (e.g. because you're in a shielded data center or don't want to raise suspicion) and need to crack some passwords. 5x Skylake CPU + 5x Polaris GPU would be a sufficiently beefy platform for Hashcat, and it would fit into your backpack.
Or let's say you want to quickly set up a Hadoop cluster to crunch some data.
Five machines is certainly small as HPC clusters go, and in reality you'll probably have to cap this to four machines because you need one port to attach to a power socket if you need the cluster for more than 2-3 hours. Nevertheless the 4x 40 GBit/s Thunderbolt ports offer some really interesting possibilities, and I'm not aware of any other portable machine that has this right now.
There are four 40 Gbit/s ports, but their total bandwidth is not 160 Gbit/s. IIRC, the ports on the right side of the machine have less total bandwidth because of how they're connected, for example.
That is correct. The Thunderbolt controller has two DisplayPort sink ports which are wired to the GPU, in addition to the PCIe 3.0 x4 interface which goes to a root port in the CPU. So to actually max out the 40 Gbit/s, you need to drive external displays and also max out the PCIe 3.0 x4 interface. Which is another way of saying that 8 Gbit/s are reserved for DisplayPort.
Yes, but as stated in the article only on macOS or Linux on non-Macs. Not with Linux on Macs unfortunately. Someone needs to scrutinize the Intel-developed code for non-Macs, figure out how to port it to Andreas Noever's driver, and fill in the missing parts by reversing the macOS driver.
Also, the Alpine Ridge controllers are attached with PCIe 3.0 x4 lanes; that's about 31.5 Gbit/s after 128b/130b encoding, so you can't fully exhaust the nominal 40 Gbit/s.
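For intuition, a quick sanity check of those numbers (just a sketch; the constants are the published PCIe 3.0 link parameters):

```python
# PCIe 3.0 signals at 8 GT/s per lane and uses 128b/130b line encoding,
# so a x4 link carries roughly 31.5 Gbit/s of usable bandwidth --
# noticeably less than Thunderbolt 3's nominal 40 Gbit/s per port.
lanes = 4
gigatransfers_per_s = 8.0        # per lane, PCIe 3.0
encoding_efficiency = 128 / 130  # 128b/130b overhead

usable_gbit = lanes * gigatransfers_per_s * encoding_efficiency
print(round(usable_gbit, 2))     # ~31.51 Gbit/s
```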
The article talked about how Linux supports several models, not just the latest one. If I owned an older Macbook but now regarded the product line to be doomed, I'd want to know how painful it would be to start migrating to a different platform on my current hardware.
Although I have experienced that choice on Apple hardware recently (2015 MBP Retina), lately Linux seems to, let's say, 95% Just Work on non-Apple x86 hardware for me: WLAN, sound, and suspend/resume.
[edit] and this is with any distro, perhaps with more or less setup work
I find sleep/resume is successful when the proprietary driver is installed; nouveau isn't doing it for me on Ubuntu-derived 16.x distros (this month I have tried KDE Neon and Bodhi). The problem, though, is that installing the drivers can be a pain. Last time I tried, I soft-bricked the PC and ended up doing a full reinstall to save time. As someone further down this thread said, what advantage is there these days in Nvidia not open-sourcing the drivers? They are still going to be selling the display cards.
The other problem with Linux I have encountered recently that burned through my free time was updating the BIOS: there are no BIOS updates with Linux installers, which is annoying because the BIOS update itself is OS-agnostic. I had to create a bootable USB drive from a Windows program (luckily I have a Windows PC for work) and boot from that. But the how-to guide from HP was lacking in details, and the program doesn't allow me to create the drive on one PC for use on another, so no BIOS update there.
The problems I have with Linux never seem to be to do with Linux, it is almost always hardware manufacturers being obstinate and refusing to share their toys with other adults.
Yeah in the future, I will certainly be buying hardware that is designed to run desktop/server Linux (and support features you mention like BIOS upgrades easily as well) or is otherwise known to support it well. No more random COTS buys for me without research first. I'm done burning hours on this sort of weird crap vendors want to give us. So if a vendor wants to sell me a PC or enterprise anything, it needs to support Linux well.
I also don't plan to purchase any brand new Apple hardware. I will only adopt Apple hardware late (and used) and will only run OS X if absolutely necessary for some reason.
> what advantage is there these days in Nvidia not open-sourcing the drivers?
AMD have demonstrated that trying to do the right thing will get you shit from the maintainers and abuse from bystanders, who will then go and buy Nvidia because it gets a better frame rate.
AMD's been in the news for trying to half-ass the right thing, and asking the community to lower their standards because AMD is too poor to do the job right.
I never understand this sentiment: AMD is still putting in effort for a token market (Linux is something like 0.3% of computers if you don't count servers/embedded). Would you rather have them do nothing and leave the cards poorly supported?
They're putting in token effort and trying to create an impediment for Linux kernel development even on other hardware. And they were warned months ago that it wasn't going to fly.
Having a driver is good, but putting it in-kernel is bad if the kernel devs can't read it and fix it.
This is a really great article, I should add, and I do wish them luck in their endeavors to support Apple hardware. The developers' tone is quite inspirational.
I wouldn't necessarily blame the "WLAN, sound, or sleep" thing on linux. I'd blame that on cheap parts manufacturers that refuse to distribute open source drivers for fear of releasing company secrets and lack of profit margins to operate on.
>I wouldn't necessarily blame the "WLAN, sound, or sleep" thing on linux. I'd blame that on cheap parts manufacturers that refuse to distribute open source drivers for fear of releasing company secrets and lack of profit margins to operate on.
I don't understand why anyone would not open-source their drivers.
What's NVIDIA's secret sauce? Being able to build great silicon.
What's Intel's secret sauce? Being able to build great silicon.
What's Samsung's secret sauce? Being able to put together great silicon.
So let's say tomorrow NVIDIA open-sources all their drivers.
So now you know which registers you use to shade what. So what's AMD going to do? Copy the interface?
It's already copied. Most users use standard OpenGL/DirectX to communicate with their board.
I mean look at Intel. You practically know what's what there. You can make an OS on (at least older) Intel chips without drivers.
> What's NVIDIA's secret sauce? Being able to build great silicon.
NVidia's defining characteristic has been designing good drivers.
Even when they've been behind ATI/AMD in hardware, they've been ahead in driver stability and performance. Especially on Linux where they largely own the CG/post/design market (which mostly uses Linux workstations) and they are the only GPU vendor that has somehow squeezed Windows-competitive performance out of Xorg.
> NVidia's defining characteristic has been designing good drivers.
This. When shopping for discrete video cards, I no longer look at AMD's offerings because I'm not willing to fight with their driver, fret over whether or not I should apply updates, etc.
So basically, most 3D applications and games are broken to a certain degree. Both NVIDIA and AMD have invested quite a lot of resources into making those applications run at an acceptable level and not look broken.
Now, when you figure out how to unbreak something and put it into the fast path, your competitor can easily lift that into their driver. Neither of these players wants that.
The good news is that DX12/Vulkan/Metal take quite a different approach: they don't have a complicated state machine and validation in the driver, making the driver simpler and these workarounds unnecessary. If everything goes well, we may see open drivers for these stacks.
Modern graphics card drivers contain full compilers for the GPU's shading language. It seems reasonable that you can extract performance with better compiler technology.
On the other hand it also seems reasonable that a compiler for e.g. NVIDIA's hardware might be hard to re-target to another vendor's hardware while retaining the same advantage.
As I understand things (and I'm clearly not an expert), a CPU opcode is like an ABI: put this value into register A, that value into register B, and a magic opcode into register C, and poof, register D contains the value of A+B.
A driver, on the other hand, will say that you put int X into memory location A and Y into memory location A+1, then call function C with an argument of A, and D will contain the multiplied number.
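A toy model of that distinction, purely illustrative (all names here are invented): the ISA contract is documented down to registers and opcodes, while the driver contract only exposes an entry point.

```python
# ISA-style contract: the semantics are public, so anyone can target it.
def isa_add(regs):
    # documented: the "magic opcode" in C means D = A + B
    regs["D"] = regs["A"] + regs["B"]
    return regs

# Driver-style contract: you only get an entry point; how the result is
# produced from memory at A and A+1 stays hidden inside the binary blob.
def driver_multiply(memory, a):
    return memory[a] * memory[a + 1]

regs = isa_add({"A": 2, "B": 3, "D": 0})
print(regs["D"])                         # 5
print(driver_multiply({0: 4, 1: 5}, 0))  # 20
```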
Let's say your driver exports an API that computes f(x,y,z) where the math turns out such that g(h(x,y),i(y,z)) is an easier and/or numerically better way to compute it.
Let's also say your hardware implements g, h and i, but doesn't wire them together yet. So, your driver's code calls the three functions.
Unknown to you, the trick to use that decomposition is patented. If you release your driver's source code, you make it easier for the patent holder to figure out that you are violating their patent (your programmers might even have made it patently obvious by mentioning the paper describing the trick in a comment).
Unlikely? Maybe, but the way patents are written, who knows? For an almost random example, I searched for "driver code and hardware patents". The first link I clicked was https://www.google.ch/patents/US20090006831. Reading that, I wouldn't know what driver would _not_ infringe on it.
Also, there are many patents on ways to move data around efficiently. Avoiding all of them while still writing a performant driver may not be possible.
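To make the earlier f/g/h/i hypothetical concrete, here's a toy sketch (all math and function names invented for illustration): say f(x, y, z) = x*y + y*z, which decomposes as g(h(x, y), i(y, z)) with h and i as multiplies and g as an add.

```python
def f_direct(x, y, z):
    # the straightforward computation the API promises
    return x * y + y * z

# imagine these three are hardware primitives the driver can invoke
def h(x, y):
    return x * y

def i(y, z):
    return y * z

def g(a, b):
    return a + b

def f_decomposed(x, y, z):
    # the "trick": wiring the primitives together -- exactly the kind of
    # arrangement a patent might cover, and plain to see in source form
    return g(h(x, y), i(y, z))

print(f_direct(2, 3, 4), f_decomposed(2, 3, 4))  # 18 18
```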
"Blame" might be the wrong word since nobody involved with Linux has committed any moral failing here. It can certainly be analyzed as a drawback of the open-source model.
Microsoft has a lot of resources to incentivize hardware manufacturers to go to great lengths to make sure their drivers work well with Windows. No organization connected with "Desktop Linux" has that.
And don't want to reveal the number of ugly hacks they have implemented to get things working in the first place.
The reason Linux drivers are a "mess" is that the hardware they are trying to support is a mess, and said mess is papered over in Windows by the OEM drivers.
If you only realized this year that they are entertainment-focused and devs/pros are a secondary concern, you haven't been paying attention. Since 10.6, when dual full screen was removed, I say the shark has been jumped.
Ouch