| Score | Title |
| --- | --- |
| 625 | How To Search ELI5: A Quick Reminder About Rule 7 |
| 1366 | ELI5: Why do microscopic organisms (bacteria etc.) look like they're CGI under a microscope. |
| 7 | ELI5: Our body fight diseases by increasing the temperature, why betraying it by cooling ourselves down? |
| 92 | ELI5: Behaviorism and how it is used to teach/educate children |
| 4 | Eli5: Why do humans have different voices? |
| 6 | ELI5: if human skin cells reproduce and you essentially have different skin than you did 5 years ago, why do scars never disappear? |
| 5 | ELI5: How do bacteria photosynthesize if they don't have membrane bound organelles (chloroplasts)? |
| 3 | ELIF: What is the source of heat for the Earths core? |
| 3 | ELI5: What makes an MR layout car more likely to oversteer? |
| 2 | ELI5: What's the difference between a savings and checking account and is it important? |
| 2 | ELI5: What is the science behind ‘beer goggles’? |
| 2 | ELI5: Why is the Earth's core so hot? If the sun went out, would the core's remain hot without the sun's energy? |
| 1 | ELI5: What happens to our muscles when we 'pull' our neck? |
| 2 | ELI5: Numbers Stations |
| 2 | ELI5: why does sugar look like a rock and make rock candy if it comes from a plant? |
| 6 | ELI5: Why can't humans hold themselves perfectly still without twitching, hence the difficulty of the game "operation"? |
| 4 | ELI5: How were cartoons in the early 2000's animated? |
| 5 | ELI5: Why are the numbers on super market scales arranged counter clockwise? |
| 2 | ELI5: Why is Denuvo still a thing when people are constantly pirating games that use it? |
| 5 | ELI5 - What is code and how does it work? |
| 3 | ELI5: The difference between nerves and neurons. |
| 11 | ELI5: How do our bodies acclimate to hot/cold temperatures? |
| 18 | ELI5: Why do Third World Countries have problems with possessing water, when the earth is 79% of it and we have the technology to purify water? |
| 3 | ELI5: How do scientists accurately reconstruct the faces of archeologically recovered skulls? |
| 0 | ELI5: why does cigarette smoke give people lung cancer and kill them but weed smoke doesn't? |
| 0 | ELI5: Why can't you cheat the lottery by doing this? |
| 1 | ELI5: Why is GRILLED chicken healthy and fried chicken not? |
| 0 | ELI5: Why are members of the armed services credited with upholding American Freedom? |
| 1 | ELI5: what is DOM elements and how it works in a web page? |
| 1 | ELI5: The controversy and theory of Jordan Peterson. |
| 1 | ELI5: Why is medical cannabis not normally used, when morphine is freely accepted? |
| 0 | ELI5: Are the odds of a baby being a boy always 50/50? |
| 0 | ELI5: What happens if you run away from police in a car and get away? |
| 1 | ELI5: If we cannot technically touch anything because the particles we’re made of repel each other, what’s in the space between them? |
| 2 | ELI5: why is the speed of earth slowest when it is farthest away from the sun in the elliptical path and vice versa? |
| 0 | ELI5: Why hasn’t Quebec gone independent yet? |
| 0 | ELI5: how do they make crisps taste like actual flavours? |
| 3 | ELI5 why things get black as they combust? |
| 1 | ELI5: Aspect ratio and dimensions |
| 2 | ELI5: How does sandboxing (computer security) work? |
| 2 | ELI5: images(math) |
20680 xilefian Eyy, I actually know the answer to this one (game & app developer with low-level expertise in power and memory management - lots of iOS and Android experience and knowledge).

---

Android was built to run Java applications across any processor - X86, ARM, MIPS - due to decisions made in the early days of Android's development. Android first did this via a virtual machine (Dalvik), which is like a virtual computer layer between the actual hardware and the software (Java software in Android's case). Lots of memory was needed to manage this virtual machine and to store the Java byte-code, the processor machine-code, and the system that translates the Java byte-code into your device's processor machine-code. These days Android uses a runtime called ART for interpreting (and compiling!) apps, which still needs to sit in a chunk of memory but doesn't consume nearly as much RAM as the old Dalvik VM did.

Android was also designed to be a multi-tasking platform with background services, so in the early days extra memory was needed for this (though it's less relevant now that iOS has background tasks). Android is also big on the garbage-collected memory model: apps use all the RAM they want, and the OS later frees unused memory at a convenient time (when the user isn't looking at the screen is the best time to do this!).

---

iOS was designed to run Objective-C applications on known hardware: an ARM processor. Because Apple has full control of the hardware, they could make the decision to have native machine code (no virtual machine) run directly on the processor. Everything in iOS is lighter-weight in general because of this, so the memory requirements are much lower. iOS originally didn't have background tasks as we know them today, so in the early days it could get away with far less RAM than Android needed. RAM is expensive, so Android devices struggled with not-enough-memory for quite a few years early on, with iOS devices happily using 256MB while Android devices struggled with 512MB.

In iOS, memory is managed by the app rather than by a garbage collector. In the old days developers had to use alloc and dealloc to manage their memory themselves - but now we have automatic reference counting, so there is a mini garbage-collection-like system happening for iOS apps; it works on a per-app basis, it's very lightweight, and it only holds memory for as long as it is actually needed (and with Swift this is even more optimised).

---

**EXTRA** (for ages 5+): What does all this mean? Android's original virtual machine, Dalvik, was built in an era when the industry did not know which CPU architecture would dominate the mobile world (or if one even would). Thus it was designed for X86, ARM and MIPS, with room to add future architectures as needed. The iPhone revolution moved the industry almost entirely onto the ARM architecture, so Dalvik's compatibility benefits were somewhat lost. What's more, Dalvik was quite battery-intensive: once upon a time Android devices had awful battery life (less than a day) while iOS devices could last a couple of days.

Android now uses a new runtime called Android RunTime (ART). This new runtime is optimised to take advantage of the target processors as much as possible (X86, ARM, MIPS), and it is a little harder to add new architectures. ART does a lot differently from Dalvik; it stores the translated Java byte-code as raw machine-code binary for your device. ~~This means apps actually get faster the more you use them as the system slowly translates the app to machine-code. Eventually, only the machine code needs to be stored in memory and the byte-code can be ignored (which frees up a lot of RAM).~~ ([This is Dalvik, not ART](https://www.reddit.com/r/explainlikeimfive/comments/7pvzmu/eli5_what_does_ios_do_differently_to_android_for/dskrltj/)). ART compiles the Java byte-code during the app install (how could I forget this? Google made such a huge deal about it too!), but these days it also uses a JIT interpreter similar to Dalvik's to avoid lengthy install/optimisation times.

In recent times Android itself has become far more power-aware, and because it runs managed code on its runtime, Android can make power-efficiency decisions across all apps in a way iOS cannot (as easily). This has resulted in the bizarre situation most developers thought they'd never see, where Android devices now tend to have longer battery life (a few days) than iOS devices, which now last less than a day. The garbage-collected memory of Android and its heavy multi-tasking still consume a fair amount of memory, but these days both iOS and Android are very well optimised for their general usage. Each OS tends to use as much memory as it can to make the device run as smoothly and power-efficiently as possible.

Remember task managers on Android? They pretty much aren't needed any more, as the OS does a fantastic job on its own. Task killing in general is probably worse for your phone now, as it undoes a lot of the spin-up optimisation done on specific apps when they are sent to the background. iOS gained task killing for some unknown reason (probably iOS users demanding it because Android has one), but both operating systems can do without this feature now. The feature is kept around because users would complain if familiar features disappeared. I expect in future OS versions the task-killers won't actually do anything and will become a placebo - or they will only reset the app's navigation stack rather than kill the task entirely.
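To make the garbage-collected model described in the comment above a little more concrete, here is a minimal Java sketch (illustration only: the `Bitmap` class and the allocation size are made up, and this is not code from Android itself). It shows that dropping the last strong reference only makes memory *eligible* for reclamation; the runtime frees it later, whenever it finds a convenient moment:

```java
import java.lang.ref.WeakReference;

// Minimal sketch of the garbage-collected model: the app allocates freely,
// drops its references, and the runtime reclaims the memory later, at a
// moment of its own choosing.
public class GcSketch {
    static class Bitmap {                      // stand-in for a large allocation
        final byte[] pixels = new byte[8 * 1024 * 1024];
    }

    public static void main(String[] args) {
        Bitmap bitmap = new Bitmap();          // the app uses all the RAM it wants
        WeakReference<Bitmap> probe = new WeakReference<>(bitmap);

        bitmap = null;   // no strong references remain: memory is now eligible for reclaim
        System.gc();     // only a hint -- the runtime decides when (and whether) to collect

        System.out.println(probe.get() == null
                ? "collected: the runtime reclaimed the memory in the background"
                : "not collected yet: reclamation happens when the runtime finds it convenient");
    }
}
```

On a phone, that "convenient moment" is exactly the kind of idle window the comment mentions, such as when the user isn't looking at the screen.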
784 kf97mopa There are several reasons relating to the varying use cases, as others have described, but the main reason is this: Android uses a form of automatic memory management based on garbage collection, while iOS uses a more manual form of memory management. Garbage collection works better if there is always a good chunk of memory free, so the garbage collector doesn't have to run as often. https://en.wikipedia.org/wiki/Garbage_collection_(computer_science)

The reason to use garbage collection is that it saves the programmer from having to manage memory manually. Memory management is tricky, and if you make a mistake you might begin to leak memory (memory consumption goes up slowly) or create a security hole. Recent versions of iOS use something called automatic reference counting, which means that the compiler (technically the pre-processor) figures out the correct memory management automatically. This moves the workload of managing memory from the phone to the computer of the developer who compiles the software.

The reason for this difference is historical. Android uses the Dalvik runtime, which borrows from Java, while iOS uses Objective-C and now Swift, which had a simple manual memory management system (manual reference counting). Apple used Objective-C because that is what they use in their own OS; Google used a Java analogue because it is a modern, safe language that was widely used by the time they launched Android, and so was easy for developers to learn.
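For contrast with the garbage-collection sketch above, here is a hand-rolled reference counter, again in Java and again purely illustrative (the class and method names are invented for this sketch; real Objective-C/Swift code relies on compiler-inserted retain/release rather than anything written by hand). The point is the deterministic free: memory comes back the instant the count reaches zero, with no collector sweeping in the background:

```java
// Hand-rolled reference counting, written in Java only to illustrate the
// bookkeeping that manual reference counting required and that ARC now
// automates at compile time.
public class RefCountSketch {
    static class RefCounted {
        private int refCount = 1;              // whoever creates the object holds the first reference

        void retain()  { refCount++; }         // a new owner takes a reference

        void release() {                       // an owner gives its reference back
            if (--refCount == 0) {
                // In Objective-C this is where dealloc would run; the memory
                // is returned immediately rather than waiting for a collector.
                System.out.println("deallocated immediately");
            }
        }
    }

    public static void main(String[] args) {
        RefCounted image = new RefCounted();   // count = 1
        image.retain();                        // a second owner appears, count = 2
        image.release();                       // count = 1, object still alive
        image.release();                       // count = 0, freed right away
    }
}
```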
716 dont_forget_canada I believe the true answer to this question is fascinating, and that it's actually just one piece of a bigger story (playing out **right now**, one that started in 1993), and that all of us are about to witness a transformation in the personal PC space that a lot of people won't see coming.

First, let's focus on why Apple's history as a company put it in the position it's in today, where it builds everything in-house and it seems to work so well. Apple has the upper hand when it comes to optimizing software and hardware together in a way that Google never can, because Apple calls all the shots on the OS, the CPU design, and the device design. Google doesn't have that luxury. Google builds one piece of the handset (the OS) and has to make it work in tandem with many other companies, like Samsung, Qualcomm and Intel (for the radio). This is a very difficult task and is why OEMs like Samsung often have to contribute a lot on the software side when building something like the S8.

The reason Apple is in this position (where it can control the entire hardware/software creation of the device) is twofold. On the one hand, Steve Jobs always wanted to control both the software and hardware aspects of the Macintosh, because he saw that it made it easier to provide users with a better UX - and the more control he could exert over users, the better. The other fascinating **and often overlooked but incredibly important** reason why Apple can do what it does with the iPhone has to do with IBM, PowerPC and a little-known company called P.A. Semi.

You see, up until around 2006 Apple used PowerPC CPUs (by IBM) instead of x86 (by Intel). Most people believe Apple switched to Intel because Intel made more powerful chips that consumed less power. That isn't the whole story. IBM designed and made the PowerPC chips, and by the time 2006 rolled around IBM had sold off the ThinkPad business, OS/2 had failed, and they were almost fully out of the consumer space. IBM was completely focused on making large, power-hungry, server-class CPUs, and here was Apple demanding small, power-efficient PowerPC CPUs. IBM had no incentive to make such a CPU, and the wait got so bad that Apple ended up skipping an entire generation of PowerBooks (the G5).

Enter P.A. Semi - a "startup for CPU design" if there ever was one. This team seemingly came out of nowhere and created a series of chips called PWRficient. As IBM dragged its feet, this startup took the PowerPC specification and designed a beautifully fast, small and energy-efficient PowerPC chip. In many cases it was far better than what Intel had going for them, and it was successful to the point that the US military still uses these chips in some places today. Their PowerPC processor was exactly what Apple was looking for, and it came at a time when IBM had basically abandoned them - Apple NEEDED this very badly. So what did Apple do? They **bought** P.A. Semi. They bought the company.

So at this point, if you're still reading my giant block of text, you're probably wondering: *but if Apple bought the company that could solve its PowerPC problem, why did it still switch to Intel?* And that's where the story goes from just interesting to fascinating: Apple immediately put the team it had just bought in charge of creating the CPUs for *the iPhone*.

See, people always ask *when is Apple going to abandon the Mac?* Well, the real answer is that they abandoned the Mac when they switched to Intel, because that was the exact moment when they not only gave up but *abandoned a perfect solution* to the Mac's CPU problem, and instead re-purposed that solution to make sure they **never have** a CPU problem with the iPhone. So what lesson did Apple learn? That if a critical component of your device (i.e. the CPU) depends on another company, it can throw your entire timeline off track and cost you millions in lost revenue (the PowerBook G5 that never happened). Apple was smart enough to know that if this was a problem for the Mac it could also be a problem for the iPhone. When a solution arrived for the Mac, they applied it to the iPhone instead, to make sure there was **never** a problem. And that team from P.A. Semi has designed Apple's ARM CPUs for the iPhone ever since; they're at least two generations ahead of the chips Android devices generally use, because they were first to market with a 64-bit architecture and first to allow the use of "big" and "little" cores simultaneously.

And as for Mac users? Well, the switch to Intel allowed the Mac to keep living, but MacOS now comes second to iOS development, and new Mac hardware is quite rare. Apple has announced plans for app development that is cross-compatible with iOS *and* MacOS. Apple has started shipping new Macs with a *second*, ARM CPU. The iPad Pro continues to gain MacOS-like features such as the dock, the file manager, and multi-window/split support. All signs point to MacOS being on life support. When Steve Jobs introduced MacOS he said it was the OS we would all be using for the next 20 years - and guess what? Time's almost up.

And the irony of it all is that history has now repeated itself: Apple has the same problem it had with IBM, but now with Intel. Intel is failing to produce chips that are small enough and run cool enough. Apple will have to redesign the internals of the MacBook to support 8th-gen chips because of changes Intel made. Then there's the Spectre/Meltdown bug. The Mac is yet again dependent on a CPU manufacturer in a way that harms Apple.

So yes, the iPhone **is** something to marvel at in terms of its performance. You might be thinking Android is the big loser here, but really it's the Mac and Intel. I believe we are at the cusp of an event that will make the IBM/PowerPC drama seem small. Five years from now we likely won't even recognize what MacOS and Windows are anymore, and Intel will either exit the portable consumer space or have to go through an entire micro-architectural redesign and rescue themselves as they did in '93 with the Pentium.

In '93 Intel almost got destroyed because its CISC chips weren't as powerful as RISC chips such as PowerPC. Intel then released the Pentium, which is essentially a RISC chip (think PowerPC or ARM) with a heavy-duty translation layer bolted on top to support the CISC instructions that every Windows PC required. This rescued Intel up until *right now*, but the industry has evolved and Intel's '93 "fix" is now its biggest problem, for two reasons: 1) Intel physically can't compete on speed/heat/size with ARM, because it has to drag along a CISC translation layer that ARM doesn't need; and 2) Windows is about to introduce native ARM support with a **software translation layer**. Remember, Microsoft has the same CPU-dependency problem that Apple has. And Microsoft's software solution allows them to throw away Intel for something better. Users won't notice the switch to ARM because it's transparent, but they will notice the 20 hours of battery life and thinner devices they get in the future once Intel is gone.