After Nilay Patel’s strong piece and John Gruber’s meager response, here’s another one by Steve Streza:
John can argue all he wants that this is all somehow in the best interest of customers by virtue of it being great business for Apple, but it simply isn’t true. It also won’t be a hill that many customers will die on at the point of sale. People will not buy into Lightning headphones, they will put up with it. This transition will be painful and difficult because of just how thoroughly entrenched the current solution is, how little the new solution offers, and how many complications it adds for customers. Nilay is correct, it is user-hostile, and it is stupid.
But hey, it’s great for Apple.
I have very little to add here, other than dongle, and a plea: can somebody finally give me a valid reason for removing the 3.5mm jack? I’ve heard nonsense about waterproofing (it can be done just fine with a 3.5mm jack), battery life (negligible, unlikely because of the location of the assembly, and entirely and utterly eclipsed by making the battery like 0.5mm thicker), cost (…seriously? That’s the best you can do?), and thinness (oh come on, the iPhone 6S is 7.1mm thick – it will take a miracle for the iPhone 7 or even 8 or 9 to be thinner than 3.5mm).
Anyone?
As far as I can tell, there are only downsides.
but hey, it’s apple. so it’s fabulous by definition.
Isn’t it?
Fabulous? I thought everything Apple did was “magical”…
i think “amazing” is the word you are looking for.
“can somebody finally give me a valid reason for removing the 3.5mm jack?”
Yeah, Lightning ports are reversible!
Oh, wait…
When it’s reversible, how can you tell if it’s an upside or a downside? 🙂
All Apple products are always upside, because Apple products have no downside, right? 😉
…we all know the real reason for the Lightning headphones is the RIAA, the MPAA, and content control.
Everything else is just an excuse and misdirection while they try to take more away from you.
There, I said it.
Yeah, and an excuse for various licensing agencies to take a cut of headphone sales. Now we’ll have Lightning headphones, USB-C headphones, 3.5mm headphones… I don’t own nor do I want that many sets of goddamn headphones! Screw this! I’ll just go Bluetooth and deal with it at this rate, because I don’t have the time to fish around for the headphones of the hour!
Be aware that Bluetooth adds a layer of lossy compression to your audio which makes it perceptibly worse. I don’t like that tradeoff.
I know. I’ll deal with that rather than needing three separate fscking pairs of headphones when I only need one now. I have enough damned devices I need to cart around for work as it is.
I agree that having multiple kinds of mutually incompatible sockets and connectors for speakers or headphones is silly, but let’s just hope it doesn’t come to that.
If you compare ‘just 3.5mm’ to BT, then I’d prefer to use the wired connection, because quality. Also maybe because I haven’t ever used a pair of BT headphones, so I might be biased towards wired headphones, but still.
Some people prefer convenience over quality. 🙁
I agree, no way am I going to buy a pair of Lightning earphones, especially when that plug is basically EOL. I just bought a pair of Bowers & Wilkins P5 Bluetooth headphones and they sound really good. Nowhere near as good as my BeyerDynamics D700’s, but honestly not much is.
Do you mean the DT770? That is one of the headphones on my shortlist (together with the Sennheiser HD380 Pro), but because the Beyers only cost slightly more than €100 (or around $150) they are far from the best headphones out there.
Since the source audio is also probably heavily compressed MP3s, it’s unlikely to make it any worse – and since the better quality headsets already speak MP3/MP4/AAC over the Bluetooth link, I suspect there is no second layer of compression going on.
Good point!
This triggers me to do two things:
* find out what impact Bluetooth compression has on audio quality (see the sketch below)
* check which devices support unconverted audio streaming over Bluetooth
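For the first bullet, here’s a minimal way to put a number on it (a sketch in Python, assuming you’ve already captured a decoded copy of the stream, e.g. by recording the Bluetooth receiver’s line-out into a time-aligned WAV; the file names are placeholders):

```python
# Rough SNR estimate between an original WAV and a Bluetooth-round-tripped
# copy. Both files must share a sample rate and be time-aligned already.
import numpy as np
from scipy.io import wavfile

def snr_db(original_path: str, decoded_path: str) -> float:
    rate_a, a = wavfile.read(original_path)
    rate_b, b = wavfile.read(decoded_path)
    assert rate_a == rate_b, "sample rates must match"
    n = min(len(a), len(b))              # trim to the shorter recording
    a = a[:n].astype(np.float64)
    b = b[:n].astype(np.float64)
    noise = a - b                        # everything the codec altered
    return 10 * np.log10(np.sum(a ** 2) / np.sum(noise ** 2))

print(f"SNR: {snr_db('original.wav', 'bt_roundtrip.wav'):.1f} dB")
```

A higher SNR means the codec changed less. It’s a blunt instrument (it ignores perceptual masking), but it’s enough to compare, say, SBC against aptX or a wired loopback.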
no such thing as real uncompressed audio over bluetooth. i don’t think the bandwidth size and latency times are up to it.
no substitute for the wires to the speakers, not yet, in most cases.
and yes i listen to bluetooth speakers sometimes and of course can enjoy myself just fine. but wired ones will always sound better.
No, there is not such scarcity as you describe. There are a bunch (not all) of Bluetooth devices that have good drivers and handle FLAC with all of 10 seconds of queue lag. Bluetooth 5 will (start to) permit 4x the data rate (and range) in Sept. 2016 or so, making it more common.
Lossless 6-channel 24/128k…yeah, a little farther out.
Well, yes, and no. Bluetooth 1.2, without A2DP, you’re absolutely correct. Bluetooth 2.0 kicked the bandwidth up to 3 Mbit/s, and Bluetooth 3.0 leaped up to 24 Mbit/s. Apparently, most people’s knowledge of Bluetooth froze in 2005.
So armed with just a Bluetooth 2.1 headset and a fairly recent A2DP profile, you shouldn’t have any problems, since A2DP can (but isn’t required to) support direct transfer of MP3/AAC in native format, and BT 2.1 will easily handle 2.1 Mbit/s.
Either way, with a minimum bandwidth of 768 kbit/s, I suspect there’s more than enough room for a 192 kbit/s stream, or even a 320 kbit/s stream, since they’re being transferred in their compressed form.
The only problem with Bluetooth 2.1’s A2DP implementation is the lag between transmission and playback. It’s noticeable, since it’s about a quarter second. Bluetooth 3.0 +HS went a long way toward fixing that, however, and 4.0 seems to have finally done away with it for good.
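To sanity-check those figures, the arithmetic is straightforward (a sketch; the link capacities are the ones quoted in this comment, not independent measurements):

```python
# Does a given audio stream fit in the quoted Bluetooth/A2DP capacities?
# Link figures are the ones quoted above, not independent measurements.
links_kbps = {"quoted A2DP minimum": 768, "BT 2.1 EDR": 2100, "BT 3.0 +HS": 24000}
streams_kbps = {"MP3 192k": 192, "MP3 320k": 320,
                "CD PCM 16/44.1": 16 * 44100 * 2 // 1000}  # = 1411 kbit/s

for link, capacity in links_kbps.items():
    for stream, rate in streams_kbps.items():
        verdict = "fits" if rate < capacity else "does NOT fit"
        print(f"{stream:16} over {link:20}: {verdict} ({rate} vs {capacity} kbit/s)")
```

Note that uncompressed CD PCM (1411 kbit/s) is the one case that doesn’t fit the quoted 768 kbit/s minimum, which is exactly why A2DP always applies some codec at the low end.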
I don’t know much about bluetooth, true.
For hi-res audio you are going to need to get 8 Mbit/s to each speaker driver, in perfect sync with each other. The analog signal travels down a wire at nearly the speed of light, so the latency is minimal and the left and right channels stay in sync.
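Both halves of that claim are easy to check with back-of-the-envelope arithmetic (a sketch; the cable length and velocity factor are assumed values, not measurements):

```python
# Bitrate of a hi-res PCM stream, and propagation delay down a headphone wire.
bit_depth, sample_rate, channels = 24, 192_000, 2     # one common "hi-res" PCM spec
bitrate_mbps = bit_depth * sample_rate * channels / 1e6
print(f"24/192 stereo PCM: {bitrate_mbps:.1f} Mbit/s")  # ~9.2 Mbit/s

cable_m = 1.2                     # assumed headphone cable length
velocity_mps = 0.7 * 299_792_458  # assumed ~0.7c signal velocity in copper
print(f"wire latency: {cable_m / velocity_mps * 1e9:.1f} ns")  # a few nanoseconds
```

So the ~8 Mbit/s figure is in the right ballpark for 24/192 stereo, and the wire’s propagation delay is measured in nanoseconds, far below anything audible.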
I don’t even mind the audio trade-off. My biggest issue with Bluetooth headphones is that they become yet another thing I have to keep charged, and I can be almost certain the battery won’t last as long as I want to listen to music (several hours).
CodeMonkey,
Yes exactly, this to me is the biggest drawback. The same thing applies to keyboards and mice. I agree the wires are unsightly, but having to replace batteries sucks. Even rechargeable models need corded chargers and cradles anyways, which can be as bulky as the wired devices they replace. You never know if they’re going to work for as long as you need them to.
That is why I bought the Logitech K750 solar powered keyboard. No cable and no battery hassle. I’m still very happy with it.
Depends. I’ve got the LG Tone HBS910, and their battery is rated at about 17 hours. In practice I get about 14 or so, but I’d bet that’s longer than you’d want to listen to music at any one time. The Avantree Audition also have good battery life if you prefer a traditional closed headphone, but I didn’t like them because the microphone was, to put it mildly, god awful.
… how? I mean, are people doing analog ripping through their headphones right now? Most MP3s no longer have any DRM on them, the FLACs that I create sure as hell don’t have any DRM, most podcasts are MP3s (see above).
Sure, you won’t be able to rip that DTS-HD or Dolby TrueHD via your two-channel headphones, but if you were doing that, you’re an idiot anyway.
So, what DRM are we talking about? What content will you not be able to listen to on your lightning-based headphones that you can currently listen to on your Sony Walkman technology?
Now, conceivably, Apple could say “only these headphones will work”, but the backlash would be pretty brutal, and Apple’s just going to lose on that front.
Lightning based headphones appear to start around $20 USD, and go up, which makes it only a bit more than what people will spend on analog headphones, so I don’t see the cost issue.
Personally, I don’t care– I haven’t owned a set of 3.5mm headphones for over a decade, because for me at least, Bluetooth is way more useful. My only irritation is that USB-C is a much nicer port– but I’m also unlikely to own an i<anything> anytime soon.
MQA is the next great thing coming to lossless music. It requires a form of DRM. No doubt Apple will jump on the bandwagon at some point.
http://www.mqa.co.uk
http://www.digitalmusicnews.com/2016/02/08/tidal-teases-a-massive-m…
Also, you might have forgotten that Apple purchased Beats, so (putting my tinfoil hat on) they could theoretically DRM their headphone connections to work only where they want, OR only play back the highest-quality audio on whatever device they want.
No thanks!
In other words, snake oil. The “How it works” page has no real info on how it works, but plenty of links to “How to get it”. This reeks of con-artist.
Again, for the digital music illiterate:
https://xiph.org/~xiphmont/demo/neil-young.html
MQA is not lossless. It’s a new kind of lossy codec, but the loss is in the encoding, not applied after the encoding.
It very well might be better than what Fraunhofer wrote with the MP3 and MP4 specs, but it’s still lossy.
They (Meridian) are just claiming that they understand better what to lose than anyone from the ’90s did.
And they are looking to sell a shit-ton of DACs.
Indeed. Now we are seeing why digital connections are being pushed. There is money to be made.
Neil Young was absolutely right.
Digital being a different beast, any transform, even one for the ‘better’, causes great amounts of damage.
Apple would be committing suicide in a market where it’s already in a precarious position. So we’ll ignore that use of DRM.
So few words need a REAL explanation.
The DRM argument ignores all of recent history. You have to go back to 1999 to see things through that lens.
Okay, a valid reason – the 3.5mm jack is a fragile break point for any device. The cables that plug in there are the ones that fail the most often in my household, and sometimes they crack the case.
Yeah, it’s an old design, so there are a lot of peripherals that work with it. That doesn’t mean it can’t be improved. It also doesn’t mean it shouldn’t be improved.
There’s a lot of whining about change on this site these days. This is supposed to be a tech site. Tech is about change. Deal with it.
I keep hearing people say this, yet I’ve never known anyone who has broken a 3.5mm jack over the last 30+ years.
Even my kids (5 and under) have more problems with USB plugs and breaking them than they do the 3.5mm jacks – they have not broken a 3.5mm jack, but they have broken plugs of the USB nature.
So I really find the durability argument against the 3.5mm jack to be a straw man, and the people who break them are probably the same ones who used to use a CD tray as a coffee mug holder.
OTOH, every time something turns digital, the MPAA and RIAA are close behind trying to make it protected so that someone can’t just use it to capture movies or music. (See HDMI/HDCP, DTV, etc… lots and lots of examples in the last 15 years.) And Apple likes helping out the MPAA and RIAA.
If you give me a physical address, I can send you all the worn-out TRS sockets that I’ve had to replace in PC cases from random students.
Symptoms:
* The switch in the socket is permanently stuck in headphone mode, as if there were a headphone plug inside, so no audio is heard out of the speaker output.
* Dust stuck in the socket prevents the switch from triggering, so audio still goes to the speakers even when a plug is plugged in.
* Only one channel is heard (dust, incompatible headset — yes, this exists — or corrosion).
* No audio is heard until you jiggle/rotate the plug (dust, worn-out spring, corrosion).
* Minor connector movement causes audible noise (worn-out spring).
* Echo between both channels (dust).
* Someone has inadvertently combined gold and copper plug/socket for years (I blame the stupid audiophile-friendly market where even cheap headphones now ship with gold-plated connectors) and there’s corrosion all over the place, resulting in attenuated audio.
On the other hand:
– I’ve seen plenty of broken USB _plugs_, which is the designed failure condition for USB (when you e.g. trip over the cable), not sockets.
– The only failure type for a USB socket that I’ve ever seen is _the entire socket assembly_ separating from the PCB (because of poor solder work).
– I’ve been able to successfully establish a USB session on the dustiest of sockets I’ve ever seen, and there were no visible performance problems.
There’s just no contest: USB is more reliable, more resistant, and includes error correction (so there’s no unwanted noise, or echo, or “one-channel-only” issues). On the other hand, it is more complex and more expensive.
For me, the benefits greatly outweigh the cons.
Most of what you described for the TRS issues is easily cleanable – rust aside – and most of it is typically due to bad or broken headphones, not the socket itself.
And that’s before you consider just how widespread the use of TRS jacks is… USB just doesn’t compare.
TRS is simple in its design. You don’t have to fidget with it or get it in a given orientation for it to work. It’s used for everything from power plugs (yep, seen them) to cheap electronic headphones and microphones at 3.5mm (or smaller) to switchboard sets of all kinds – from cheap electronics to heavy-duty gear, from your media player (iPod, etc) to airlines to the military. It’s also been around for >100 years now (known traces of the TRS go back to 1878 for the 1/4″; the 1/8″/3.5mm goes back primarily to the 1960s). No microchips required – it’s pure analog.
USB has nowhere near that durability or coverage. It’s also a pain, as you have to get the orientation of the connector correct to match the trapezoid of the one type, or get the right side up for the other. USB as a standard is barely 20 years old (1996) and requires the use of microchips to operate.
Yes, USB is more complex. However, “requires microchips to operate” as a con in 2016…
Unless you’re solely using a USB port for power, you need a microchip to do anything. It doesn’t deal in analog signals, and has a very specific protocol that must be followed. When USB first came out (1.0, 1.1), implementations were very lenient about the protocol; newer versions have become quite strict, even to the point of older devices no longer working.
And yes, I’ve helped with that functionality. My previous employer had a device that worked just fine with USB 1.x and 2.x, but failed completely with USB 3.x because of the shortcuts they took with their firmware support. Unfortunately we were in no position to fix the shortcuts so the device could work with a USB 3.x port, and even more unfortunately, computers with USB 3.x ports essentially enforced the 3.x details on the older 1.x/2.x ports too. So you basically couldn’t use newer computers with said device.
Yeah, USB has some nice aspects to it, but replacing TRS is not one of them.
Seems to me that what we’ve got here are two people who each like their own connector, and refute one another with anecdotal evidence. I don’t think either of you is going to win this one. Fwiw, I’ve had my share of problems with TRS connectors over the years. I’ve also had problems with USB. I don’t much care which connector we ultimately use, so long as it’s standard. In the end, any wired connector has benefits and drawbacks and, being a physical connection by nature, is susceptible to breakage of one kind or another.
Agreed. Though I’m more in the camp of “if it ain’t broke, don’t fix it/bork it over”
I think the main issue with the whole Convert TRS->USB argument is that TRS has been standard for so long that there’s already a ton of stuff out there that just works, and USB doesn’t offer any real advantage over TRS.
So TRS->USB is kind of like the whole DVD->Blu-ray issue.
Are there some benefits? Sure
Will the end-user see those benefits? Not likely.
Is it going to cost more? You bet.
So why do it? B/c it makes more money for the companies involved.
How many rolls of tinfoil did it take you to make your hat?
It’s outrageous that Apple wouldn’t justify the decision it hasn’t announced yet! Nobody is saying there are no downsides. But to complain this much about a design decision before the company making it has even announced it?
Well, the sooner they get the feedback the better, isn’t it?
To some extent, certainly. However, design is all about trade-offs. Apple certainly knows this. And whether you agree or not with their decisions on trade-offs, they are usually very easy to understand once they show the benefits behind a decision.
Apple certainly chooses to err on the side of too-early transitions. But that is how they ensure they don’t end up being too late. Floppy drives, CD drives, VGA ports, DVI ports, the 30-pin connector, etc, etc, etc were all removed before the larger industry was ready.
Frankly, our industry needs someone willing to push for “hard” transitions that force movement.
At the same time, they’re also the company that uses proprietary charging connectors and revises their docking interfaces to lock out unlicensed accessories.
Completely agree that Apple should open up for third-party power adapters and docking cables. At the same time they have just released the new MacBook with USB-C charger (assuming the definition of open/standard includes the USB-IF) and the lack of MagSafe will prevent me from buying it. (At least until I don’t have pets or kids!)
tantalic,
Sure, the need for physical media is inversely proportional to the affordability and availability of broadband networks. Over the years we’ve progressed away from the sneaker-net and towards networking. Companies like Sun were pushing this early on. If you recall, they had the slogan “The Network IS the Computer”. And sure enough, in all my years at college almost two decades ago, I never once needed sneakernet for my CS work. That’s because we had internet at school. At home, CD-ROMs were still king compared to dialup.
Sometimes corporations take this too far and “force movement” for the wrong motivations. This is a re-post, but yeah it shows how consumers can get fed up with too many changes:
Why Every New Macbook Needs A Different Goddamn Charger
https://www.youtube.com/watch?v=jyTA33HQZLA
There needs to be a moderate balance.
I was on vacation with relatives, and my brother wanted to show his pictures on the TV but could not, because his MacBook Pro didn’t have the required HDMI port. My cheap Acer laptop did and worked perfectly. I set up a shared folder for him to drop pictures into, but since Apple decided Ethernet ports aren’t important, the copy had to be done wirelessly, and it wasn’t going to be done for a couple of hours. So we ended up copying the files to a USB thumb drive (which thankfully Apple hadn’t gotten rid of yet). You can spin it any way you want, but by forcing us to resort to the sneaker-net, Apple failed us that day.
Well, now the ‘cloud’ [ma’ cloud] IS the network. What else will they come up with?
I think one of the better decisions Apple has made with the current (retina) MacBook Pro line is to include HDMI on all models. That is certainly one area they did not move fast enough.
As far as falling back to the sneaker net, you should have been able to use an ad hoc wifi network to share the files wirelessly. (It doesn’t sound like you are an active OS X user, so you probably didn’t know this feature existed or didn’t think of it at the time.) It’s not like the typical user is going to be traveling with an Ethernet cable, so an Ethernet port wouldn’t help out many users in your situation.
Again, design is all about trade-offs. Look at what Apple has used the space for on the MacBook Pro line: SD card reader, USB ports, Thunderbolt ports. At least for my needs these are much better uses of the space than an Ethernet port. (For what it’s worth, I’ve owned a Thunderbolt-to-Ethernet adapter for years now and I don’t think I’ve used it in at least a year.)
tantalic,
This past year I was in a hotel with a co-worker and we had a very large ~150GB dataset to transfer. It should have taken about 30 minutes with gigabit Ethernet, but he forgot his dongle at home. We started the transfer over wifi, but it was intolerably slow. The sad thing is that I anticipated this and purchased a 128GB USB 3 SanDisk ahead of time, exactly because of my earlier experiences with the MBP, which once again forced us to resort to a sneakernet.
I get what you are saying, but frankly non-MacBook-Pro users would have been ready to go. Honestly I’d be annoyed if my laptop needed dongles to interface with the rest of the world. While it’s true that I have more demanding needs than a “typical user”, especially in dealing with servers, this thing has “pro” in the name. Someone might want to transfer large video files or whatever; it shouldn’t be too much to ask for a laptop with built-in Ethernet IMHO.
Altman,
That is fair criticism, and I can’t disagree that the trade-offs Apple has selected for the MacBook Pro line do not align well with your needs. However, I also don’t think it’s fair to say that because it has “pro” in the name it MUST have some feature that your workflow needs.
Again, for my needs (and I would suspect those of at least a majority of pro users) the trade-offs make sense. To lose any of the ports I mentioned to gain Ethernet would be a great disservice to my needs every day.
(For what it’s worth you can also use Thunderbolt to go machine to machine which should be 10-40x Gigabit speeds, although at that point your disk speed will likely be your bottleneck.)
tantalic,
Nah, other laptops can have all those ports PLUS Ethernet PLUS a DVD-ROM PLUS VGA PLUS media readers, etc. If anything, the laptop’s thickness is the main physical constraint, which I think we can both agree on. Going thinner also affects battery life (and/or performance) and keyboard depth as well. There’s always going to be pros and cons – to each their own!
“…As far as falling back to the sneaker net…”
Hadn’t got that lingo. “falling back” ;D
Ooooh God!
Surely a more balanced view is possible. I had mentioned some other pros before.
http://www.osnews.com/comments/29250?view=flat&threshold=0&sort=com…
And of course there are pros of sticking with the ubiquitous analog standard we already have. I’m a bit confused why people are finding this topic, of all things, to be so polarizing?
TBH I don’t do much with audio, but my desktop speakers right now exhibit some AC hum because the analog cables happen to run alongside the power cables.
Integrated audio in generic PC desktops is total and utter crap. Motherboard designers do not isolate it from the digital circuits, so you can hear the CPU, etc.
I can’t even plug headphones into my work PC, as it can’t drive them.
For your desktop, there’s a much simpler solution:
Buy a USB extension and one of these $3 (free shipping) USB audio adapters and use the shortest possible analog cable between it and your device:
https://www.dx.com/p/usb-3d-sound-adapter-color-assorted-5831
(Seriously. Even if you just plug it into your front panel USB jacks and leave the analog cable the same, it’s far less noisy than most motherboard microphone jacks.)
So you really think your $3 USB dongle is giving you a high quality digital/analog converter?
I tend to only use the ADC, so most of my experience is with that side of it, but I can honestly say that it produces a massive SNR improvement over every onboard microphone jack I’ve compared.
I actually keep a few in my box of parts to give away whenever a friend’s Skype call sounds uncomfortably noisy. They fix that too.
“…my desktop speakers right now exhibit some AC hum…” Unplug, turn it 180º and replug. If it persists, check for a good ground [your neutral should also be linked to ground at the utility feed].
If it still persists at that point, blame your speakers.
Are you sure your problem isn’t related to grounding? Unless your power and audio cords run parallel for a long length or your computer is a real power guzzler, it sounds unlikely that induction could be at play. If you are sure the problem is related to induction, then try shielded twisted wires if you cannot move the cords apart; it helps a lot in cases like this.
acobar,
Haha, there’s a lot of interest in my little audio problem… thanks, all
It’s actually a power strip connecting just about everything around me. The weird thing is that it comes and goes; you may be right about there being a grounding issue, and shielded cables couldn’t hurt either. The speakers are off most of the time anyway, so it’s not a high priority. If anyone were to see my setup here, it’s embarrassingly modest… I probably have more servers than most people, though.
FYI: I had a similar issue with the audio hum, so I bought one of these and the hum was completely eliminated:
https://www.amazon.com/gp/product/B019393MV2/ref=oh_aui_detailpage_o…
Bad audio is always hell, Alfman. Good audio is almost a LOST ART.
For a silent rig experience, I recommend the well-regarded http://silentpcreview.com/
So you mean to say you want to buy new speakers that have a digital connection?
Good news everyone! There already is (and has been for a loooong long time) something called SPDIF and most motherboards already support and have this connection.
I have a 2.1-set from the year 2000 (Cambridge Soundworks Digital – link to review: http://www.anandtech.com/show/525) with a coaxial SPDIF connection and that works great on my current motherboard: a Gigabyte GA-MA770-UD2 rev2.0.
Some motherboards only support optical SPDIF however, so be aware when you buy your things that they are compatible.
Gargyle, speaker cones wear out eventually, even if you have given them a good life.
Gigabyte has some motherboards for the casual audiophile…
Yes, unfortunately the 3.5mm ports on most computers, including the MacBook Pro, are badly designed/isolated. We recently had an issue with buzzing at an important presentation.
However, in addition to $3 USB dongles, many “gaming” motherboards isolate the audio circuits, since headsets are assumed to always be worn in those usage scenarios. This might be true for gaming laptops as well, but they tend to be heavier.
Another alternative is getting a proper DAC, but they are usually more expensive ($100+). So if you’re interested only in getting rid of the buzz, yes I would also suggest a cheap USB dongle.
One argument I think is valid is the “glass sheet” argument. The 3.5mm jack assembly package (~6mm) needs to be in the bezel or _under_ the screen. This is a problem for two reasons. First, if the screen+glass is >=2mm, then it doesn’t fit. The other reason is physical stress.
The 1/8″ headphone connector will induce some Z-axis pressure on the display assembly when it is bent. It will act as a lever and push against the OLED assembly, killing the fragile pixels below the jack. The USB-C and/or Lightning connector is too short to cause a lever effect.
So if the iPhone 7 is indeed bezel-less, the 1/8″ jack has to go.
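The lever claim is easy to illustrate with rough numbers (a sketch; the plug insertion depths are typical values I’m assuming, not manufacturer specs):

```python
# Rough leverage comparison: torque at the socket mouth from a sideways tug
# on the cable. Plug insertion depths below are assumed typical values.
plugs_mm = {"3.5mm TRS": 14.0, "Lightning": 6.7, "USB-C": 6.5}
tug_newtons = 5.0  # assumed sideways pull on the cable

for name, depth_mm in plugs_mm.items():
    torque_mnm = tug_newtons * depth_mm  # N * mm = mN*m
    print(f"{name:10}: {torque_mnm:.0f} mN*m from a {tug_newtons:.0f} N tug")
```

With roughly twice the insertion depth, the TRS plug puts about twice the torque on whatever sits behind the socket for the same pull, which is the lever effect described above.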
/me doesn’t have, or care about, iPhones
The Moto Z has magnetically attached speakers, the JBL SoundBoost. A Moto Mod block could be made for the Moto Z phones that provides 3.5mm ports.
Moving to a setup where 3.5mm ports attach magnetically could be interesting. Note I said ports: having more than one 3.5mm port would be useful for those doing audio work on the device.
Yes, there is some truth to 3.5mm ports being a nightmare for waterproofing. Sorry to say, so are USB-C and Lightning.
One reason to remove 3.5mm ports from a device is to reduce the impact/lever issue. But by that logic you should nuke USB-C and Lightning as well, and go to flat contacts on the back with a magnetic join.
So I would not look at this as an either/or: what if USB-C/Lightning and 3.5mm ports were all replaced by flat contacts on the back, with a magnetic back panel providing the physical ports? Impacts or user clumsiness on those cable connections would simply disconnect the magnets, preventing further device damage.
The problem with the dongle idea is that the lever length of the USB-C/Lightning plug and the lever length of the 3.5mm plug add to each other, equalling more broken devices.
So yes, removing 3.5mm ports from the core of a phone makes sense, but along with USB-C/Lightning ports, swapping over to flat contacts held in place by magnets.
You write “it will take a miracle for the iPhone 7 or even 8 or 9 to be thinner than 3.5mm”
You forget that the hole itself is 3.5mm; you also have to count the material around the outside of that 3.5mm hole. Looking at the current iPhone, you have the thickness of the front of the phone, the thickness of the back of the phone, and then the outside diameter of the 3.5mm hole. It all adds up, and you need a phone that is something like 7mm thick to even fit that 3.5mm hole in it.
Except that there’s a lot of other stuff that adds to the thickness of the phone, making that 7mm minimum a hard limit, at least currently. Your front glass is 2-3mm thick, the PCB around 1.5mm, and a processor will probably be another 2mm. All in all, you’re hitting the 7mm minimum thickness.
I get that trade-offs need to be made when designing a complete product, but there’s a point when the negative aspects start outweighing the positive. Wasn’t it the iPhone 6S that had an issue where it bent in users’ pockets? I haven’t heard anything else about that, so it could have been fake reports.
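Summing the figures quoted in this sub-thread shows where the ~7mm floor comes from (a sketch using the numbers above plus an assumed back-casing thickness, not measured parts):

```python
# Stack up the component thicknesses quoted above to see the ~7mm floor.
stack_mm = {
    "front glass": 2.5,   # quoted as 2-3mm
    "PCB": 1.5,           # quoted
    "processor": 2.0,     # quoted
    "back casing": 1.0,   # assumed; not quoted above
}
total_mm = sum(stack_mm.values())
print(f"component stack: {total_mm:.1f} mm (vs the ~7 mm floor quoted above)")
```

And that floor leaves nowhere near the ~6mm of height a jack assembly needs between the glass and the casing, which is why it ends up in the bezel.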
Some dots….are they connected?
Wearable computing is probably going to be a big area of both innovation and development in the tech industry.
Currently the most common tech items people wear are headphones on their head.
Currently those headphones plug into a ‘stupid’ port that cannot feasibly be used to support a complex system of data and interaction.
Perhaps with the headphones plugged into a ‘clever’ port a new arena of interaction and new complex systems could be supported.
I have no idea whether any of that is relevant or reflects anything that Apple are planning or working on – but it seems to me that the key difference between an old-fashioned headphone jack and a Lightning port is that one is stupid and one can be clever.
As to whether ditching the old port will cause a consumer rebellion – I am somewhat sceptical – a few years back I was firmly told many times that a computer without a floppy drive would simply be rejected by the consumer and we all know how that panned out.
Apple has been promoting the idea of wireless technologies for some time now. Just look at the new 12-inch MacBook, keyboard and mouse. Everything is expected to be done wirelessly (besides charging, but even that is probably under discussion).
The same thing will probably apply to headphones. There is a market for fashionable Bluetooth headphones out there, and for those whining about better audio, there will be the new Lightning/USB-C port that will provide enhanced audio quality, though I highly doubt this will come at an affordable price point.
Wires are simply not practical for those listening to music on the go or for taking calls.
I keep hearing a reason for ditching the 3.5mm jack being thickness.
Is 6.9mm too thick for anyone? Is that really an issue?
Competitive fashion issues, those ones. FlyingJester.
I already thought anything much below 10mm starts to feel too thin for my liking. But fashion tells me I’m wrong. Seriously, there is no need for phones to get thinner. Battery technology isn’t advancing quickly enough to support such anaemic batteries – a 9mm thick phone instead of a 7mm thick phone could triple battery life.
Until I can unroll my screen from my small cylindrical pocket device, probably not (see the ‘Earth: Final Conflict’ communicator). We’re getting close to that tech being practical.
Until now there has been no comment from Apple regarding the rumored removal of said port (nothing unusual here).
As far as I know, the only corroborating evidence was an MFi certification program for Lightning-connected headphones, which from my point of view only means that they want to add some quality control to another way of outputting audio on iOS devices (there are already people using external DACs with the USB adapter).
Additionally, doesn’t anything that connects to the Lightning port require certification?
The first “smartphones” shipped without such a port are Android ones (the Moto Z afaik), and we can all agree that Android phone manufacturers sling new “features” at the wall to see what sticks (and I am mainly an Android user).
Please tell me if I am missing some info/confirmation, because I feel people are making a mountain out of thin air (again, nothing unusual here).
I always use portable DACs with my iPhone (right now a Mojo) so I don’t give a fuck about the 3.5mm jack because the integrated ones usually suck.
IMHO people are making drama about the 3.5mm jack, and it is NOT the point; the point is keeping a way to get a plain stereo PCM stream (no DRM, no proprietary shit).
As long as Apple keeps a plain stereo PCM digital output (through the Lightning port, for example)… what’s the problem!?!?
Companies will integrate the DAC into the headphones with a Lightning connector and that’s all… and people with regular phones can use a dongle (and if you are an audiophile and use very expensive headphones like me… well, it’s very likely that you already use an external DAC anyway).
So it’s not a big deal.
And there is the problem.
People carrying their own DACs are a tiny minority. Most people use the 3.5mm jack, so really the best option for consumers would be to have both available, keeping both groups of consumers happy.
Sure, and keeping both is great, but removing the 3.5mm jack is not a big deal as long as Apple keeps an unencumbered linear PCM output… We are making a drama out of something pretty stupid.
Regular people will buy $20 headphones with a Lightning connector and that’s all… and high-end headphone owners will keep using external DACs/amps as they always did.
Is this “ideal”? I dunno, but IMHO this is not a real problem. A lot of digital devices dropped analog outputs and nobody cared (DVD players, videogame consoles, computers, etc). But now, this is big drama just because the word “Apple” is associated with it… c’mon… there are better wars to fight.
“But now, this is big drama just because the word “Apple” is associated to it… c’mon… there are better wars to fight.”
Seen the water start boiling at the bottom and don’t like that, at all.
This is a stupid fight, so stupid because it is so late. And it’s not just about you, Apple. [Where are my damn vinyls?]
I love vinyls too. It’s not digital or vinyl… it’s digital AND vinyl.
There’s no need to fight at all.
“… it’s digital AND vinyl.
There’s no need to fight at all.”
Maybe you don’t see it, Sergio. Those are being ripped apart. Analog ‘freaks’ are being expurgated from digital ‘paradise’. Because hubris, money & power.
And you are right about this: we are not going to let analog die. It’s always easy to ‘mount’ digital over analog.
That’s an escape door, needed to keep decency above the waterline.
A small proposal to the Consortium: analog and its natural deterioration-on-copy WAS the better frontier against UNLIMITED pirate chains, limiting the hopping to 3 or 4 nodes before turning the product into a torture instrument.
As for the Intel Community: analog allows direct copying of private non-products to 3 or 4 nodes. [Device rendering of non-products should always be analog.]
That’s great, but why don’t you show/pastebin your typical loadout of portable DACs and phones, audio mixing breakout, compression modules, MIDI boxes etc. (and yes, I would like to jog with a mixing board as part of casual gymnastic strength training) while not swallowing your Pelican case, explaining why:
a) Apple would permit (break its own oaths, Pro Audio cred, etc. to allow) an uncompressed PCM stream to clog their nice port
b) Give credentials out that BangGood vendors and shoppers can use, and not revoke iLuv’s or Griffin’s (for example) when Apple wants to throw shade
c) What’s the wireless Lightning protocol again?
d) When was the last time you broke and then repaired a Lightning connector?
Also, in fall when Bluetooth 5.0 comes out with 4x the bandwidth and range, does that do something for you, or is it still not up to (FLAC/Monkey/other data companding and) DAC-on-earbuds snuff?
I find this issue to be much like a storm in a teacup. It’s already the third story on the front page on this topic, and everything people had to say has been said already.
For myself, when phone manufacturers all move to digital audio connectors, I will buy a cheap pair of USB earbuds and get on with life; I expect those to have a price in the same ballpark as their analog counterparts. And I don’t care what Apple will do; their customers seem to have a fetish for overpriced stuff.
It is just another reason to avoid Apple gadgets, as the lightning port in general is.
They are overestimating the leverage they have on the market. They might have a 50% share of the market in the US, but outside it is much less, and falling. They will not manage to force their proprietary standard on the world. It will just make their products more awkward in the long run.
It reminds me of this
1984: “No command line?!”
1998: “No floppy drive?!”
2008: “No CD drive!?”
2010: “No Flash?!â€
2015: “No USB!?”
2016: “No headphone jack!?”
It seems dinosaurs never went extinct – they just changed their mating call.
And in what way is this supposed to be an argument for a digital headphone jack? For the sake of changing to digital? For the sake of changing something every few years?
Actually, in high end music recording they are beginning to switch back to analog tape. Like this guy: http://www.blabbermouth.net/news/slash-in-the-studio-real-to-reel-w…
You Apple people still haven’t managed to give me one decent argument to remove the 3.5mm jack other than “Apple did it with floppies too!”, which are two completely and utterly different situations.
After so many articles and so many comments across so many websites, I do declare that there is no argument to do so, other than the user-hostile reasons such as lock-in, DRM, and control.
Why do the Apple people need to come up with a reason? Motorola (so, it’d be Android) already did away with it there. Have them answer first, since they did it first this generation (plenty of others did it prior to this generation). Also, it’s not the people doing away with it, it’s the companies.
Personally, I like my Veer and I use it with bluetooth, but I don’t listen to music on it. It’s just used as a phone. I have headphones on my iPhone, but again, don’t listen to music with them, just an occasional video.
I see no reason to change it, as the market is saturated with a proven technology, no matter its age.
Hey, I’m an Apple person. Well, not really, but I use an iPhone, and I personally hope Apple keep the 3.5mm jack, make an official announcement that they’ve got no plans to abandon it, and make all these misinforming Apple bloggers/fanboys (along with Motorola) look like fools in the process.
See my other comment
One of the biggest reasons to dump the unnecessary 3.5mm connector is real estate, which inside a cellphone casing is at an absolute premium. Dumping the 3.5mm connector is not about adding space for a bigger battery, and it’s not about all the idiotic DRM FUD being slung about like monkeys do their excrement. It’s simply getting rid of something that’s unnecessary and wasting space where slivers actually matter.
People don’t seem to realize how big the 3.5mm connector is internally. Take a look at some cellphone tear down pics and compare the connector to other components. Dumping it offers a lot of redesign potential.
you are right about the size being a big part of this. maybe the cover.
things aren’t either or, corporate departments get things along with other departments sometimes.
will we see them design a smaller OPEN, FREE port for audio? probably not.
So exactly which “decent argument” can you give to justify the removal of the floppy disk that would not apply, literally letter by letter, to also justify the removal of the 3.5mm jack?
Wow, you sure love being told what you don’t need throughout the years and doing what you’re told.
Apple die hard fans are exceedingly good at justifying (to themselves) why paying more for less functionality is a privilege…
Think different*
* In a pre-approved way.
The difference is that most of them made sense at some level or another. GUI is a massive step up from the CLI for most average users. Floppy and CD drives, while still common, were rapidly declining. Flash is pants, and also declining. USB is probably the only one there that’s vaguely comparable, and yet I’d suspect there are more people using headphones with a 3.5mm jack than there are using USB every day. I know it’s the case for me anyway, even though I’m a heavy computer user.
Wait… Apple removed the USB ports? Well, that’s another reason I’d never buy a MacBook.
Even my OpenPandora (ARM-based palmtop) has a full-sized USB A jack to complement the USB OTG port and separate charging port and I use it all the time. (And the successor, the Pyra, is going to have one USB 3.0 OTG, two full-sized USB 2.0, one of which can output eSATA with an adapter, and a second USB micro port for charging or serial debug console. They were even planning to use an adapterless USB/eSATA port until it got EOLed.)
It’s bad enough that the $80 Android tablet I bought for testing websites came with a wired-only keyboard and a charger. (Guess what, geniuses: I don’t want to be swapping plugs every few minutes so I can type AND keep the damn thing charged!)
…and yes, the $80 tablet does have a MicroSD slot and I do use it.
You may want to be careful with a few of your examples:
Hmm, USB type C isn’t USB? You might want to double check that one…
This is not about thinness or robustness or digital rights.
It’s about multiple devices.
For instance, only Bluetooth works with Continuity and Siri/Music/IP phones/videos/etc. You’re NOT going to have Siri on your speakers in a work environment, and you’re going to use it from your phone, your iPad and your Mac.
And of course the watch, which will be a 4G standalone device soon enough.
Bluetooth and multi pairing is the only way, and this is the *real* reason why Apple is doing this.
At any rate, I’m glad Apple is putting a bullet in the head of, not the 3.5mm jack per se, but physical tethering of my head to a specific device. Think about how stupid this really is. Apple is right.
And given how extremely good my latest BT headset sounds, I’m happy about this. Audiophiles don’t use phones as a source anyway.
You’re right but people are still going to latch on to those stupid excuses because they’re easy to defend against. There’s a reason you don’t see the people whining about thinness or DRM addressing any of the real dialog about this.
Can somebody please give me a valid reason for not using VGA anymore?
Or CD-ROM drives?
Or floppy disks?
Or PS/2 ports?
Probably it’s for the same reasons.
About DRM: I don’t know why you miss the point that it’s an analog output but the phone is all digital; why wouldn’t they apply DRM in the phone directly if they really wanted to, analog output or not?
With PS/2 keyboards you can press as many keys as you wish at once and they will all register while USB maxes out at 6.
Newer isn’t always better.
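For what it’s worth, the 6-key figure comes specifically from the USB HID boot-protocol keyboard report: one modifier byte, one reserved byte, and exactly six keycode slots (NKRO over USB is possible with custom report descriptors, but the ubiquitous boot protocol caps at six). A minimal parsing sketch in Python, with made-up report bytes for illustration:

```python
# Parse a USB HID boot-protocol keyboard report: 8 bytes total, with only
# six keycode slots -- hence the famous 6-key limit. Bytes below are made up.
def parse_boot_report(report: bytes):
    assert len(report) == 8, "boot-protocol keyboard reports are 8 bytes"
    modifiers = report[0]                       # bitmask: ctrl/shift/alt/gui
    keys = [k for k in report[2:8] if k != 0]   # 0x00 marks an empty slot
    overflow = all(k == 0x01 for k in report[2:8])  # six 0x01s = rollover error
    return modifiers, keys, overflow

mods, keys, overflow = parse_boot_report(bytes([0x02, 0, 0x04, 0x05, 0x06, 0, 0, 0]))
print(f"modifiers={mods:#04x} keys={keys} rollover_error={overflow}")
```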
That sounds particular to a device driver or peripheral chip, but okay, what multiplayer 1-keyboard games or other things get you to need more than 6 simultaneous keypresses?
I am imagining that you are Yoked Galileo reinventing interval workouts on a bespoke Cherry keyboard, or the principal audience for 4-player Galaxian or Space Fortress-like games’ keyboard-only interfaces.
The valid reason for those is that there are better technologies around.
The 3.5mm audio jack? Not so much. The 3.5mm jack is ubiquitous, standard, works well to the point where it’s no longer noticed, and therefore very convenient. The supposed next step up isn’t the same as going from PS/2 to USB, floppy disks to cd-roms to flash.
This change is a sideways step rather than a forward step. Like minidiscs.
Because they can then control what headphones are used, so you need to buy a set of headphones or a DAC officially licenced by the vendor. Sure, you can probably buy unapproved ones, but in that case you won’t hear your music, which would drastically limit your options.
Then you’re not talking about DRM, you’re talking about MFi-approved hardware, which is a different thing.
they will never talk about DRM, they prefer not to.
It’s that simple.
From the article:
Actually, I’m pretty sure there’d be a DAC chip in the iPhone even if the 3.5mm jack got removed. They’d still need to convert audio for the phone’s speakers, after all.
Rendering music is awesome. When phones and computers could first do it, it was pretty amazing. Very convenient.
But they don’t render music properly. Even with lossless files, the hardware is not built to render music accurately or pleasantly.
Not to mention all the interruptions from the thousands of other features on the device.
Buy a DAP. A PonoPlayer costs under $500 and will render digital audio perfectly, with no interruptions or upgrades or DRM or Apple nonsense, for many years. It will play all your current files and all the better ones you will own in the future.
Buy lossless music. Support the artists and the beauty of their art.
So like the ones at http://gearpatrol.com/2015/05/14/6-best-portable-hi-fi-music-player… I should carry or build into furniture something that I have to weave my own case for, get dick and ribs for compression and other effects I’d like, have bespoke repair notes, and only replicates the cost of the iDevice a few times; rather than just pick a phone with decent DACs or those in the Grados, or the Bluetooth phones that play lossless stuff directly?
Mm. Fine advice for headline soloists, maybe instrumental professors and coaches who have to change flash media often. Why don’t these things have multiple slots, for $500? For $600 you should have ENDLESS microSD slots and 8 headphone jacks, that each transcribe notes from the headset mic.
All those devices are pretty nice. Each will do the trick. I prefer the PonoPlayer at $400, but it has limited availability outside the US.
I don’t understand your building-it-into-furniture comment. Those things fit in pockets, purses, bags, etc. just like an iPod.
An iDevice with a contract costs damn near $100/month for many people. A DAP is a single purchase that should run for 10+ years, and with the PonoPlayer the battery is easily swappable a few years down the road.
Now I have everyone’s attention 😀
I saw a lot of articles about Apple planning to remove the 3.5mm jack, and no one seems to care about the fact that an Android vendor (Motorola) already shipped a product without the headphone jack.
This is not about Apple; this is a stupid move by the entire industry.
I don’t care if they introduce a new port to replace the 3.5mm stereo minijack (much)
i think it’s pretty stupid, as they could easily make it a dual digital/analogue design (they’ve had dual digital/optical 3.5mm ports in iMacs many times). But that said – the internal volume the port requires could be smaller… though it’s a hard argument. I’d sooner remove the speakers, or at least the stereo one at the mic end. Come on, if you want stereo, use a proper stereo. I’m not being an SQ nerd; it’s just that the spatial separation offered by two tiny drivers 11cm or so apart is negligible.
The MAIN thing is that whatever the design, Lightning or USB-C or other, it should have the capability to output analogue too (at least at CD quality – and if you have DRM-free tracks on your device, I see zero reason it shouldn’t have at least a 96/24-quality DAC and a half-decent amp too). I’m fine if super high quality output (192/24 PCM or DSD) is only via digital, even DRM’d if they must.
They HAVE to have some CD-quality (and ideally higher) analogue audio output on the same port, easily sent via a simple adapter to a female 3.5mm minijack port – otherwise they’re basically giving two fingers to the owners (and manufacturers) of current high-quality headphones, particularly high-quality earbuds.
High-quality over-ear headphones almost always need a bit more current than phones’ internal amps can offer, and generally need a separate headphone amp, even a portable one.
DVI was dual, until the (monetizing) consortium decided not anymore. Come on. Money has been the one talking all this time. Stop trying to kid every kid.
Right now I’m in doubt whether they killed analog in the name of ecology.
There is another issue that I have with this too. If they do this in the phone, when does this become standard in laptops and desktops? You know that will happen too. Forget the headphones: EVERY guitar, keyboard, and mic I own uses a 1/4″, 1/8″, or XLR connector. With various (cheap) adaptors I can make anything work for a few bucks. But if Apple has the Lightning connection in mind for audio???
No way. Heck, right now, every time they update Logic or ship an incremental OS release, none of my DAWs work for a month or more until drivers are updated. Apple is lousy at audio. Who cares if you have great software if you have underpowered magical machines with underpowered USB-C ports and an amazing Lightning audio connector that you can’t get any audio into without a ($29.99 US) adaptor every time you want to plug something in… Uggg…
Apple didn’t like it that Beats and Sony could sell peripherals such as Hi-Fi headphones for their device without giving Apple a cut. Can’t we just agree on that and give this subject a rest?
Period, end of story, that’s the reason.
Apple wants to get a cut from anything that is connected to or runs on their iPhone. Even the Web isn’t safe, with Apple blocking ads on webpages but not in news-reading apps (they don’t get a cut from the first, but they do from the second, or at least can exercise that option any moment they want).
Unfortunately most Android OEMs will find that ditching the 3.5mm jack saves a couple of square millimetres of area and ditch it too. I expect Sammy to keep the jack though, like they did with the microSD, making it easier for the average guy and earning their #1 position in sales for yet another year, despite the nerd cries about “bloatware”.
For the other OEMs though, before we know it, we’ll be back to Nokia’s horrible “pop-port”, which unplugged itself…
Crap.
Should I get a Nexus 5 or 5X before this stupidity hits Android shores?? (I prefer the 5X but heard it has lag and compass issues; some people say they were fixed with an update, some not.)
Until they pull an S6, you mean?
1 – vendor control – through drivers
2 – DRM – using signed encoding like MQA
3 – hardware sales – new headphones, new DAC dongles
4 – more features through the port – navigation, sensor data
5 – smaller size port
I’ve been on this for years now. Not happy about it but it appears to be our future reality.
I like when Apple kills expired tech.
I hate when Apple tries to kill perfectly good audio tech. No phone is capable of pushing analog audio better than a 3.5mm jack can handle. This is not for quality improvement, it is for business reasons only.
Apple doesn’t like that you can use the Square POS system through the 3.5mm jack, and the fruit wants its share!
I don’t really care about this subject personally, as it will not have a real effect on sound quality. The poor digital-to-analog converter integrated into phones will be replaced by even worse cheap crap integrated into the headphones.
But since that crap now has to be built into every single headphone set out there, I’m afraid this change will just lead to increased electronic waste, which obviously is a horrible direction to take.
Of course companies can sell “adapters” (i.e. soundcards) to connect the regular headphones into the digital interface on the phone, but usually people don’t want to carry cables and stuff with them everywhere they go.
THIS.
…… that the suggestion that a modern computing platform might phase out an analog port and replace it with a digital port would freak out so many techies
Because many of those techies actually understand what the terms “digital” and “analog” mean, perhaps?
they are both loaded terms. i agree it takes basic understanding of how audio flows through a circuit to get any of this, and most OS computer types assume incorrectly when it comes to audio.
it’s about why they are phasing it out, tony.
they are trying to bypass the analog port because it’s the last unlocked cheap standard port in the world. the music labels and publishers are behind apple on this because they want DRM back for future generations. they can’t have it with an analog port and they can’t have it with FLAC or WAV files.
music content owners accept that they lost the first round of DRM, which tried to build DRM over top of the codecs and file formats.
for the 2nd round they are going to put the DRM in the codec, meaning the DAC will only convert files it approves of.
“for the 2nd round they are going to put the DRM in the codec, meaning the DAC will only convert files it approves of. ”
DACs are so technologically trivial, how could this approach not be bypassed the very next day?
Well, it’s not shipped yet! I think the whole concept could fail. Not many here seem very happy about it. Content owners want DRM back, even for streaming, and they know it won’t work with another layer of 3rd-party software managing it, like what was tried 10-20 years ago.
MQA format has the ability to put DRM into the encoding of the file itself with a DAC keyed to determine legality of the file before decoding it, making it much harder to get around. They are selling it as a way to offer higher-quality streams within the file, but it’s DRM under any guise.
Remember this MQA stuff is all theorized, not proven. Meridian are marketing with lots of hype but very little detail. Very few people have even heard the format in action.
I have no evidence that Apple is sniffing around MQA but it seems obvious to me. Apple also seems to be planning to get out of the download business and MQA gives them the ammo for “HD streaming” or whatever they will call it.
It’s all numbers, folks (storage/bandwidth) — Apple’s AAC “Mastered for iTunes” program will accept 24-bit audio and sells it as 24/48 lossy AAC, with an effective bitrate slightly higher than a CD. This allows them to compete – or at least not fall too far behind – the hi-res crowd, at around 1300k-bitrate AACs.
MQA claims to offer a similar sound quality at around 500k bitrate, allowing them to easily stream. And there’s all those other ‘business considerations’ built into the format.
Basically, MQA is the result of someone who studied MP3 compression code for two decades and believed it had the right theory but completely incorrect (or non-musical) details – stuff like where/how the filters are applied, how the perceptual coding implements loss, how it hides its loss artifacts, how the frequency bands interact with each other, how the whole thing is stored and unwrapped before playing… it’s almost a full rewrite of MP3 compression. It focuses on timing cues, spatial cues, soundstage, and timbre, things MP3 seemed to ignore and thus obliterate.
I find MQA fascinating but 10-15 years too late to market. We now have the bandwidth to stream 16/44 lossless PCM. Most of us have the bandwidth to stream 24bit lossless music, at least up to 96k sample rates. (None of us do because no one offers it yet).
MQA solves a problem from a decade ago that’s not really a problem at all. But the public have shown they want convenience over quality time and time again, and MQA attempts to bring quality into a tiny little streamable file. That of course has people interested.
I still believe the real force behind this is sales of new hardware with DRM built in, and the pitch that they can now stream “HD” at 500k and leave PCM behind – since PCM has no DRM built in.
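To be concrete about what’s being theorized here (and to be clear: this is an illustration of the scheme described in this comment, not a documented MQA mechanism), a “keyed DAC” would amount to something like a decoder that refuses any stream lacking a valid signature. A purely hypothetical sketch:

```python
# Purely hypothetical illustration of a keyed decoder, as theorized above.
# This is NOT a documented MQA mechanism; key and payload are made up.
import hashlib
import hmac

DEVICE_KEY = b"key-baked-into-the-dac"  # hypothetical per-device secret

def decode_if_authorized(frame: bytes, tag: bytes) -> bytes:
    expected = hmac.new(DEVICE_KEY, frame, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, tag):
        raise PermissionError("unsigned stream: decoder refuses to run")
    return frame  # real audio decoding would happen here

frame = b"\x00\x01\x02\x03"  # stand-in for an encoded audio frame
tag = hmac.new(DEVICE_KEY, frame, hashlib.sha256).digest()
print(len(decode_if_authorized(frame, tag)), "bytes decoded")
```

The practical weakness is the one raised earlier in the thread: the key has to live in every licensed DAC, and secrets that ship in millions of consumer devices rarely stay secret.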
Update — just read an interview with Neil Young last week and he says Tidal is going to MQA, which is now backed by Warner!
OK. Damn, that changes things. I wonder if Apple will follow Tidal’s lead or stick with 256k AAC. Apple has ALAC too, which would be easy to sell through their download store, but I think they are getting out of the download music business.
http://wp.me/a2MP5A-1RJ
load that up and check it – a nice list of various audio files:
FORMAT – EFFECTIVE BITRATE – BITDEPTH – SAMPLERATE
– the m4a six down is an Apple store purchase in the Mastered for iTunes program. It’s a 24/48 m4a, which is lossy. A very strange middle ground, at a 1372k bitrate, about the same as 16/44.
– see how the aif is uncompressed, making the effective bitrate almost double that of the flacs around it at the same quality
– hopefully looking at that list makes you wonder if your 320k files have everything in them.
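If you want to reproduce a list like that, effective bitrate is just file size over duration. A minimal sketch (assuming the third-party mutagen library for reading the duration; the file name is a placeholder):

```python
# Effective bitrate = file size / duration, regardless of container.
# Uses the third-party mutagen library (pip install mutagen) for duration.
import os
from mutagen import File

def effective_bitrate_kbps(path: str) -> float:
    duration_s = File(path).info.length       # playback length in seconds
    return os.path.getsize(path) * 8 / duration_s / 1000

print(f"{effective_bitrate_kbps('track.m4a'):.0f} kbit/s effective")
```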
I don’t want to sound all conspiracy theorist, but Apple joined the Bluetooth Special Interest Group in 2015.
I believe that this is an investment to ultimately try to make this a standard or make some bank while they can, and ultimately save money and space while pushing for Bluetooth.
I don’t know why on earth they’d do this, since at this point the 3.5mm jack is like… legendary, but I can see it as a way to do what they do best: get people so invested in their proprietary stuff that they can never leave it affordably, and monetize it.
At best, the newest Bluetooth standard (the one with WiFi) uses as much power as the 3.5mm jack, but this move is just… anti-environment, due to the addition of batteries to the devices that would use it instead of cables.
I don’t know what to say really besides those two. There is no upside either for the user or everyone else, except for Apple, and like all corporations, all they do is for money, ultimately.
Change has to happen, otherwise we will never grow. It would be better though if Apple were to go the USB-C route instead of their proprietary connector. But I do think that the 3.5mm headphone jack has been around too long; it’s time things got smaller and digital.
you mean cheaper and crappier…?
they’re not doing ANYTHING to help the customers. if they’d just worked on USB-C instead of wasting time on Lightning, ditching the 3.5mm jack wouldn’t be as much of a problem