Consider a set (or cluster) of sensors S located at different points, or nodes, in ordinary space. Each sensor measures one or more values, such as temperature. We assume that the information from the sensors at their different positions is transmitted to a gateway node not deterministically but as a probabilistic phenomenon: noise in the network randomly perturbs the original measurements, so the value X received at the gateway is a random variable. The information available at the gateway is therefore a probability distribution. We can show that this distribution changes with the values at the sensor nodes, so the sensor measurements act as parameters that define the distribution of the values at the gateway. In other words, the probability at the gateway is conditioned on the original measurements at the sensor nodes. This probabilistic approach does not capture the topology of the network, only the conditional probability at the gateway given the sensors.

We now compute the derivative of the conditional Boltzmann entropy with respect to any variation of the sensor values, for any value X at the gateway. The resulting matrix describes the sensor configuration, and from it we can compute the Fisher information of the sensors: it is the Hessian of the average entropy function on the sensor space S. The Fisher information endows the sensor space S with a geometry, or form. This sensor information is very important for recovering the form of the phenomenon that we want to measure with the different sensors. A network of sensors, together with its geometry, thus goes beyond an individual sensor, which measures only one value and cannot discover the field, or form, of the physical phenomenon.
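As a minimal numerical sketch of the idea, assume a hypothetical model in which two sensor values theta are transmitted through additive Gaussian channel noise with covariance Sigma, so the gateway observes X ~ N(theta, Sigma). Under this assumption the Fisher information of the sensor parameters is the inverse noise covariance, I(theta) = Sigma^{-1}, and it can be estimated from gateway samples via the covariance of the score (the gradient of the conditional log-probability with respect to theta). All names and values here are illustrative, not from the original text:

```python
import numpy as np

# Hypothetical model: two sensors with true values theta transmit to a
# gateway; channel noise is Gaussian with covariance Sigma, so the gateway
# observes the random variable X ~ N(theta, Sigma).
rng = np.random.default_rng(0)

theta = np.array([20.0, 25.0])          # illustrative sensor readings (e.g. temperatures)
Sigma = np.array([[0.5, 0.2],
                  [0.2, 0.8]])          # assumed channel-noise covariance
Sigma_inv = np.linalg.inv(Sigma)        # analytic Fisher information for this model

# Monte Carlo estimate: I(theta) = E[ score(X) score(X)^T ], where the
# score is grad_theta log p(x | theta) = Sigma^{-1} (x - theta).
samples = rng.multivariate_normal(theta, Sigma, size=200_000)
scores = (samples - theta) @ Sigma_inv  # one score vector per gateway sample
fisher_mc = scores.T @ scores / len(samples)

print("analytic I(theta):\n", Sigma_inv)
print("Monte Carlo estimate:\n", fisher_mc)
```

In this Gaussian sketch the Fisher information is constant over the sensor space, i.e. the geometry it induces is flat; for non-Gaussian or state-dependent noise, I(theta) varies with theta and the sensor space acquires a nontrivial form.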