| Score | Title |
| ---: | --- |
| 858 | How To Search ELI5: A Quick Reminder About Rule 7 |
| 14644 | ELI5: Why do cars travel in packs on the highway, even when there are no traffic stops to create groups? |
| 11 | ELI5: Why are our fingertips, ear holes and nostrils all approximately the same size? |
| 7 | ELI5: Why do body parts (fingers, eyes, etc) twitch randomly? |
| 11 | ELI5 Retarded Time |
| 4 | ELI5: Umbral Moonshine |
| 4 | ELI5: Why do jet skis shoot a stream of water straight up when they go forward? |
| 7 | ELI5: Why are some sounds, like nails on a chalkboard, so universally hated by humans? |
| 4 | ELI5: How does ethylene make fruits and vegetables ripen faster? |
| 3 | ELI5: Why are solar gardens good investments for wealthy people? |
| 3 | ELI5: Why does a copied URL dirrct me yo a different page? |
| 2 | ELI5: Why is it not blinding to look directly at the sun early in a sun rise or late into a sun set? |
| 2 | ELI5 why is the polar star always north? |
| 12 | ELI5: How many ants does it take to make a functioning ant colony? |
| 17 | ELI5: SD. SS. SA. Gestapo. Wehrmacht. Sipo. Kripo. What were they all and how do they relate to each other? |
| 4 | ELI5: If I use the same amount of coffee grounds but more water, does my caffeine content change? |
| 2 | ELI5: Why do animals appear to care so much for their young, but not so much when the 'children' get older? |
| 3 | ELI5; What is the difference between a break and a fracture? |
| 6 | ELI5: Air movement in a house |
| 2 | ELI5: Why does the wind typically pick up during the middle of the day and die down in the evenings? |
| 1 | ELI5: Why does NASCAR race on oval tracks, rather than the more complicated layouts of other motorsports? |
| 4 | ELI5: How can sperm cells "swim" through something as thick as seminal fluid? You wouldn't be able to swim through honey for example. |
| 4 | ELI5: Why can't there be an "universe's point of reference" in relativism? |
| 2 | ELI5: Why do our eyes lose focus after staring at something for a while? |
| 5 | ELI5: Why/how can most species of animals hold their breath underwater for far longer than humans can? |
| 1 | ELI5: Risk Parity strategies in investing - how they work and what are the advantages/disadvantages? |
| 1 | ELI5: How is a bank started? |
| 2 | ELI5:Why is eating healthy 80% of being healthy? |
| 1 | ELI5: How is it possible to perceive a game servers tick rate going from 30 to 60 when your ping is not that fast? |
| 5 | ELI5: Objectively, what are the limitations of carbon dating? |
| 1 | ELI5: Security as a Service |
| 0 | ELI5: How do space shuttles launch off the modified 747s? |
| 1 | ELI5: What the hell is Umbral Moonshine? |
| 8 | ELI5: Why are cones and pyramids exactly 1/3 of a cylinder or prism's volume? |
| 1 | ELI5: How much does food affect building strength? |
| 6 | ELI5: What causes exhausts to have that rasp-y sound people tend to associate with tuners? (civics, integras, etc) |
| 2 | ELI5: Why is it stated sharks will suffocate if they quit swimming, but I see examples like the white-tipped reef shark who spend the day laying on the bottom? |
| 0 | ELI5: Why are steroids more popular in baseball than football or basketball? |
| 0 | ELI5: Why does the body make women throw up or get nauseous when pregnant? |
| 16 | ELI5: When and why did 8 hours of sleep become the standard for a solid night’s rest? |
| 2 | ELI5:How some stars become pulsars? |
**20796 | xilefian**

Eyy, I actually know the answer to this one (game & app developer with low-level expertise in power and memory management - lots of iOS and Android experience and knowledge).

---

Android was built to run Java applications across any processor - x86, ARM, MIPS - due to decisions made in the early days of Android's development. Android first did this via a virtual machine (Dalvik), which is like a virtual computer layer between the actual hardware and the software (Java software, in Android's case). Lots of memory was needed to manage this virtual machine and to store both the Java byte-code and the processor machine-code, as well as the system for translating the Java byte-code into your device's processor machine-code. These days Android uses a runtime called ART for interpreting (and compiling!) apps, which still needs to sit in a chunk of memory but doesn't consume nearly as much RAM as the old Dalvik VM did.

Android was also designed to be a multi-tasking platform with background services, so in the early days extra memory was needed for this (it's less relevant now that iOS has background tasks). Android is also big on the garbage-collected memory model: apps use all the RAM they want, and the OS frees unused memory later at a convenient time (when the user isn't looking at the screen is the best time to do this!).

---

iOS was designed to run Objective-C applications on known hardware, which is an ARM processor. Because Apple has full control of the hardware, they could make the decision to have native machine code (no virtual machine) run directly on the processor. Everything in iOS is lighter-weight in general because of this, so the memory requirements are much lower. iOS originally didn't have background tasks as we know them today, so in the early days it could get away with far less RAM than Android needed. RAM is expensive, so Android devices struggled with not-enough-memory for quite a few years early on, with iOS devices happily using 256MB while Android devices struggled with 512MB.

In iOS the memory is managed by the app, rather than by a garbage collector. In the old days developers had to use alloc and dealloc to manage their memory themselves - but now we have automatic reference counting, so there is a mini garbage-collection-like system happening for iOS apps; it works on a per-app basis, it's very lightweight, and memory is only held for as long as it is actually needed (and with Swift this is even more optimised).

---

**EXTRA** (for ages 5+): What does all this mean?

Android's original virtual machine, Dalvik, was built in an era when the industry did not know which CPU architecture would dominate the mobile world (or if one even would). Thus it was designed for x86, ARM and MIPS, with room to add future architectures as needed. The iPhone revolution moved the industry almost entirely onto the ARM architecture, so Dalvik's compatibility benefits were somewhat lost. What's more, Dalvik was quite battery-intensive - once upon a time Android devices had awful battery life (less than a day) while iOS devices could last a couple of days.

Android now uses a new runtime, the Android Runtime (ART). This runtime is optimised to take advantage of the target processors as much as possible (x86, ARM, MIPS), and it is a little harder to add new architectures to. ART does a lot differently to Dalvik; it stores the translated Java byte-code as raw machine-code binary for your device. ~~This means apps actually get faster the more you use them as the system slowly translates the app to machine-code. Eventually, only the machine code needs to be stored in memory and the byte-code can be ignored (frees up a lot of RAM).~~ ([This is Dalvik, not ART](https://www.reddit.com/r/explainlikeimfive/comments/7pvzmu/eli5_what_does_ios_do_differently_to_android_for/dskrltj/)). ART compiles the Java byte-code during the app install (how could I forget this? Google made such a huge deal about it too!), but these days it also uses a JIT interpreter similar to Dalvik to avoid lengthy install/optimisation times.

In recent times Android itself has become far more power-aware, and because it runs managed code on its runtime, Android can make power-efficiency decisions across all apps in a way that iOS cannot (as easily). This has resulted in the bizarre situation most developers thought they'd never see, where Android devices now tend to have longer battery life (a few days) than iOS devices - which now last less than a day.

The garbage-collected memory of Android and its heavy multi-tasking still consume a fair amount of memory, but these days both iOS and Android are very well optimised for their general usage. Both OSes tend to use as much memory as they can to make the device run as smoothly and as power-efficiently as possible. Remember task managers on Android? They pretty much aren't needed any more, as the OS does a fantastic job on its own. Task killing in general is probably worse for your phone now, as it undoes a lot of the spin-up optimisation that is done on specific apps when they are sent to the background. iOS gained task killing for some unknown reason (probably iOS users demanding one be added because Android has one) - but both operating systems can do without this feature now. The feature is kept around because users would complain if these familiar features disappeared. I expect in future OS versions the task killers won't actually do anything and will become a placebo - or they will only reset the app's navigation stack rather than kill the task entirely.
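To make the garbage-collected memory model described above a little more concrete, here is a minimal sketch in plain Java (not Android-specific; the `Bitmap` class below is a made-up placeholder, not the Android framework one). The app only drops its references; the runtime reclaims the memory whenever it chooses, and `System.gc()` is merely a hint, never a guarantee.

```java
// Minimal sketch: in a garbage-collected runtime (Dalvik/ART on Android, or any JVM),
// the app never frees memory explicitly. It just stops referencing objects and the
// runtime reclaims them at a time of its own choosing.
public class GcSketch {
    static class Bitmap {
        // Pretend this is a large object, e.g. a decoded image (~8 MB).
        final byte[] pixels = new byte[8 * 1024 * 1024];
    }

    public static void main(String[] args) {
        Bitmap screenshot = new Bitmap();   // memory is claimed here
        System.out.println("Allocated " + screenshot.pixels.length + " bytes");

        screenshot = null;                  // the app simply "forgets" the object...
        System.gc();                        // ...and can only *hint* at collection; the
                                            // runtime frees it whenever it sees fit,
                                            // e.g. when the user isn't looking at the screen
    }
}
```

Contrast this with the reference-counted model described in the next comment, where memory is returned at a predictable point rather than "later".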
**783 | kf97mopa**

There are several reasons relating to the varying use cases, as others have described, but the main one is this: Android uses a form of automatic memory management based on garbage collection, while iOS uses a more manual form of memory management. Garbage collection works better if there is always a good chunk of memory free, so the garbage collector doesn't have to run as often.

https://en.wikipedia.org/wiki/Garbage_collection_(computer_science)

The reason to use garbage collection is that it saves the programmer from having to manage memory manually. Memory management is tricky, and if you make a mistake you might begin to leak memory (memory consumption goes up slowly) or create a security hole. Recent versions of iOS use something called automatic reference counting, which means that the compiler (technically the pre-processor) figures out the correct memory management automatically. This moves the workload of managing memory from the phone to the computer of the developer who compiles the software.

The reason for this difference is historical. Android uses the Dalvik runtime, which borrows from Java, while iOS uses Objective-C and now Swift, which had a simple manual memory management system (manual reference counting). Apple used Objective-C because that is what they use in their own OS; Google used a Java analogue because it is a modern, safe language that was widely used by the time they launched Android, and so was easy for developers to learn.
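As a rough illustration of the "manual reference counting" mentioned above (what pre-ARC Objective-C's retain/release did, and what ARC now automates by inserting those calls at compile time), here is a hypothetical sketch. It is kept in Java only so this thread sticks to one language; the `RefCounted` class and its methods are invented for illustration. The key point is that memory is released the instant the last reference is given up, with no collector running "later".

```java
// Hypothetical sketch of manual reference counting. The resource is released the
// moment the count reaches zero - deterministic, unlike garbage collection.
final class RefCounted {
    private int count = 1;                 // the creator holds the first reference
    private final Runnable onFree;         // what to do when nobody needs the object

    RefCounted(Runnable onFree) { this.onFree = onFree; }

    void retain() { count++; }             // a new owner announces itself

    void release() {                       // an owner is done with the object
        if (--count == 0) {
            onFree.run();                  // freed immediately, no GC pause needed
        }
    }
}

class Demo {
    public static void main(String[] args) {
        RefCounted image = new RefCounted(() -> System.out.println("memory freed now"));
        image.retain();    // a second owner appears
        image.release();   // first owner done - one reference still left
        image.release();   // last owner done - prints "memory freed now" immediately
    }
}
```

ARC's job is essentially to write those `retain()`/`release()` calls for the developer at compile time, which is why the work moves from the phone to the developer's build machine.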
**734 | dont_forget_canada**

I believe the true answer to this question is fascinating, and that it's actually just one piece of a bigger story (playing out **right now**, one that started in 1993), and that all of us are about to witness a transformation in the personal PC space that a lot of people won't see coming.

First, let's focus on why Apple's history as a company put it in the position it's in today, where it builds everything in-house and it seems to work so well. Apple has an upper hand in optimizing software and hardware together that Google can never have, because Apple calls all the shots on the OS, the CPU design and the device design. Google doesn't have that luxury. Google builds one piece of the handset (the OS) and has to make it work in tandem with many other companies like Samsung, Qualcomm and Intel (for the radio). This is a very difficult task, and it's why OEMs like Samsung often have to contribute a lot on the software side when building something like the S8.

The reason Apple is in this position (where it can control the entire hardware/software creation of the device) is twofold. On the one hand, Steve Jobs always wanted to control both the software and hardware aspects of the Macintosh, because he saw that it made it easier to provide users with a better UX - and the more control he could exert over the users, the better. The other fascinating **and often overlooked but incredibly important** reason Apple can do what it does with the iPhone has to do with IBM, PowerPC and a little-known company called P.A. Semi.

You see, up until around 2006 Apple used PowerPC CPUs (by IBM) instead of x86 (by Intel). Most people believe Apple switched to Intel because Intel made more powerful chips that consumed less power. That isn't the whole story. IBM made the PowerPC designs and chips, and by the time 2006 rolled around IBM had sold off ThinkPad, OS/2 had failed, and they were almost fully out of the consumer space. IBM was completely focused on making large, power-hungry, server-class CPUs, and here was Apple demanding small, power-efficient PowerPC CPUs. IBM had no incentive to make such a CPU, and it got so bad with Apple waiting on IBM that they ended up skipping an entire generation of PowerBooks (the G5).

Enter P.A. Semi - a "startup for CPU design" if there ever was one. This team seemingly came out of nowhere and created a series of chips called PWRficient. As IBM dragged its feet, this startup took the PowerPC specification and designed a beautifully fast, small and energy-efficient PowerPC chip. In many cases it was far better than what Intel had going, and it was successful to the point where the US military still uses these chips in some places today. Their PowerPC processor was exactly what Apple was looking for, it came at a time when IBM had basically abandoned them, and Apple needed it very badly. So what did Apple do? They **bought** P.A. Semi. They bought the company.

So at this point, if you're still reading my giant block of text, you're probably wondering: *but if Apple bought the company that could solve their PowerPC problem, why did they still switch to Intel?* And that's where the story goes from interesting to fascinating: Apple immediately put the team they had just bought in charge of creating the CPUs for *the iPhone*.

See, people always ask *when is Apple going to abandon the Mac?* Well, the real answer is that they abandoned the Mac when they switched to Intel, because that was the exact moment when they not only gave up but *abandoned a perfect solution* to the Mac's CPU problem, and instead re-purposed that solution to make sure they **never have** a CPU problem with the iPhone. What lesson did Apple learn here? That if a critical component of your device (i.e. the CPU) depends on another company, it can throw your entire timeline off track and cost you millions in lost revenue (the PowerBook G5 that never happened). Apple was smart enough to know that if this was a problem for the Mac, it could also be a problem for the iPhone. When a solution arrived for the Mac, they applied it to the iPhone instead, to make sure there was **never** a problem. And that team from P.A. Semi has designed Apple's ARM CPUs for the iPhone ever since; they're at least two generations ahead of the chips Android devices generally use, because they were first to market with a 64-bit architecture and first to allow the use of "big" and "little" cores simultaneously.

And as for Mac users? Well, the switch to Intel allowed the Mac to keep living, but MacOS now comes second to iOS development, and new Mac hardware is quite rare. Apple has announced plans for app development that is cross-compatible with iOS *and* MacOS. Apple has started shipping new Macs with a *second* ARM CPU. The iPad Pro continues to gain MacOS-like features such as the dock, a file manager, and multi-window/split support. All signs point to MacOS being on life support. When Steve Jobs introduced MacOS he said it was the OS we would all be using for the next 20 years, and guess what? Time's almost up.

And the irony of it all is that history has now repeated itself: Apple has the same problem it had with IBM, but this time with Intel. Intel is failing to produce chips that are small enough and run cool enough. Apple will have to redesign the internals of the MacBook to support 8th-gen chips due to changes Intel made. Then there's the Spectre/Meltdown bug. The Mac is yet again dependent on a CPU manufacturer in a way that harms Apple.

So yes, the iPhone **is** something to marvel at in terms of its performance. You might be thinking Android is the big loser here, but really it's the Mac and Intel. I believe we are at the cusp of an event that will make the IBM/PowerPC drama seem small. Five years from now we likely won't even recognize what MacOS and Windows are any more, and Intel will either exit the portable consumer space or have to go through an entire micro-architectural redesign and rescue itself as it did in '93 with the Pentium.

In '93 Intel almost got destroyed because its CISC chips weren't as powerful as RISC chips such as PowerPC. Intel then released the Pentium, which is essentially a RISC chip (think PowerPC or ARM) with a heavy-duty translation layer bolted on top to support the CISC instructions that every Windows PC required. This rescued Intel up until *right now*, but the industry has evolved and Intel's '93 "fix" is now its biggest problem, for two reasons: 1) Intel physically can't compete on speed/heat/size with ARM any more, because it has to drag along a CISC translation layer that ARM doesn't need; and 2) Windows is about to introduce native ARM support with a **software translation layer**. Remember, Microsoft has the same CPU dependency problem that Apple has, and Microsoft's software solution lets them throw away Intel for something better. Users won't notice the switch to ARM because it's transparent, but they will notice the 20 hours of battery life and the thinner devices they get in the future once Intel is gone.