There are many ways to answer this question. One view is that a single neuron cannot store any information on its own; information is stored in the state of a group of neurons. In that picture, each neuron in the group acts as a container for a single bit, an on/off switch. But thinking of neurons as digital (0/1) is problematic, because it assumes a neuron's state is binary.
This is where the concept of an action potential becomes important. Each neuron in this hypothetical memory system holds a non-binary amount of electrical potential across its membrane, and when the membrane reaches a certain threshold potential, a chain reaction is triggered that sends an action potential along the axon. That action potential may carry encoded information to another neuron and trigger another action potential, and so on, until many neurons have been activated by the initial event.
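The threshold behaviour described above can be sketched with a leaky integrate-and-fire model, a standard textbook simplification (the parameter values below are illustrative, not biological measurements):

```python
def simulate_lif(input_current, threshold=1.0, leak=0.1, dt=1.0):
    """Leaky integrate-and-fire sketch: membrane potential accumulates
    input, decays ('leaks'), and emits a spike on crossing threshold."""
    v = 0.0
    spikes = []
    for t, i in enumerate(input_current):
        v += dt * (i - leak * v)   # integrate input, leak toward rest
        if v >= threshold:
            spikes.append(t)       # action potential fired
            v = 0.0                # reset after the spike
    return spikes

spikes = simulate_lif([0.3] * 20)  # constant drive -> regular spiking
```

Note how the membrane state is continuous between spikes, which is exactly why treating the neuron as a plain 0/1 switch loses information.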
I know this isn’t exactly the cut-and-dried “a neuron stores 10 bytes of data” answer you might have been looking for, but these are at least some ideas to think about: the question is really more difficult and complex than it might seem.
The neuron itself does not "store information", but its interaction with other neurons can be represented by weights, i.e. once it fires, how much it increases or decreases the polarization of the neurons on the other side of its synapses.
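As a minimal sketch of that idea (the neuron names and numbers are made up for illustration), a spike shifts each downstream neuron's membrane potential by a signed weight: excitatory weights depolarize, inhibitory weights hyperpolarize:

```python
# Resting potentials of three hypothetical downstream neurons.
downstream_potentials = {"n1": -0.70, "n2": -0.70, "n3": -0.70}
# Signed per-synapse effect of one spike: >0 excitatory, <0 inhibitory.
weights = {"n1": 0.15, "n2": -0.08, "n3": 0.05}

def propagate_spike(potentials, weights):
    """Apply one spike's weighted effect to every downstream neuron."""
    return {name: v + weights[name] for name, v in potentials.items()}

after = propagate_spike(downstream_potentials, weights)
# n1 is depolarized (excited), n2 is hyperpolarized (inhibited)
```

The "information" here lives in the weights, not in any single neuron's state.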
One neuron in the human brain is directly "connected", on average, to about 7,000 other neurons. These connections excite or inhibit the activity of the neurons at the receiving end to a varying degree, determined by the properties of the synapse (which neurotransmitter is used, how many vesicles are released, how many neurotransmitter molecules per vesicle, how many receptors are present at the receiving end, and what the re-uptake time is).
The resolution needed to describe a weight, then, is roughly the maximum number of vesicles that can be released into the synaptic gap, times the number of different neurotransmitters, times the range of molecules per vesicle, times the maximum number of receptors on the receiving end, times the maximum number of re-uptake transporters in the synaptic gap.
The lowest bound I can put on this resolution is about 10^15 distinguishable states per synapse, but it's probably much higher.
Thus a lower bound on the information encoded in the activation of a neuron would be (10^15)^7000 possible states across its synapses, i.e. about 7000 × log2(10^15) ≈ 348,802 bits, or roughly 43 kB.
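Spelling out that arithmetic (using the answer's own assumed figures of 10^15 states per synapse and 7,000 synapses):

```python
import math

states_per_synapse = 10**15   # the answer's lower-bound guess
n_synapses = 7000             # average connections per neuron

# Independent synapses multiply: (10^15)^7000 states in total,
# so the information content is 7000 * log2(10^15) bits.
bits = n_synapses * math.log2(states_per_synapse)
kilobytes = bits / 8 / 1024

print(round(bits))       # ≈ 348802
print(round(kilobytes))  # ≈ 43
```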
So a neuron encodes at least 43 kilobytes, and probably much more. I haven't even touched on the temporal dynamics of the neuron (which may fire in bursts, with a varying number of activations and varying time gaps between them).
I was wondering lately: when we dream something extremely complex, like walking in a meadow surrounded by millions of plants, how does our brain render those plants? Is every plant rendered separately, or as a group, or... ("render" is probably a terrible verb for describing this.)
In [this paper](https://www.sciencedirect.com/science/article/pii/S0896627304005288) by Brunel et al. (Neuron, 2004), they provide an estimate of 5 Kb for a Purkinje cell. Note that these cells are somewhat special, however: they are located in the cerebellum rather than the cortex, and they have ~15x the number of synapses of the more common pyramidal neurons found in the cortex.
Also, the method used is reasonable but (obviously) open to question. Without getting into too much detail: there could be alternative ways of computing the storage capacity of a neuron, depending for example on the neural code, which we don't really know. Some of the assumptions made in the paper may not hold. And surely the "arrangement" of the network stores information as well, somehow. Etc.
On the other hand, the computations performed to derive the paper's 5 Kb figure are already quite hard and involved, and the accompanying experiments needed to feed the parameters into the equations require extraordinary technical skill. (Basically, these scientists are among the best in their field, is what I'm saying.) So a better estimate is going to take a while, I'm afraid.
That is hard to answer, because the question's presumption that neurons store information the way bits do is simply not the case.
Bits are deterministic value storage for our deterministic computation algorithms (which is arguably changing with neural-network-style AI algorithms). A neuronal network, however, stores information in the network as a whole: neurons fire with some probability based on the accumulated power of their triggering inputs.
Contrary to deterministic algorithms, which are planned and can be followed step by step (input, manipulation, output), neuronal networks are too complex and/or too indirect to understand in that way. There is input and output, but the network is not planned and hand-crafted; it is trained with input data, and with feedback on that data, until it learns the desired output.
So a single neuron arguably stores no information. It does have properties like activation probability, activation energy and connections to other neurons, but that is probably not what the question means by 'stored information'.
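A toy sketch of that probabilistic triggering (the logistic form and the parameter values are illustrative assumptions, not a biological model):

```python
import math
import random

def fires(inputs, weights, threshold=1.0, steepness=4.0):
    """Probabilistic threshold unit: firing probability rises
    smoothly with the accumulated, weighted input drive."""
    drive = sum(w * x for w, x in zip(weights, inputs))
    p = 1.0 / (1.0 + math.exp(-steepness * (drive - threshold)))
    return random.random() < p

# Strong accumulated input -> fires almost surely;
# weak or inhibitory input -> almost never.
```

The contrast with a bit is the point: the same input does not always produce the same output, so the "stored" quantity is a distribution over behaviours, not a fixed value.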
Biological brains are not computers, and they don't "store" bytes. In fact, NOTHING we know about computers really applies to brains beyond "information goes in, is processed, and then goes out". Everything beyond that is completely different in every way.
Thinking of the brain being like a computer is like thinking of the human body as being like a steam engine.