Internet of Things (IoT) applications generate tremendous amounts of data that are not only extremely large but also often missing, noisy, and uncertain due to the intrinsic characteristics of IoT. These phenomena pose a number of challenges for managing the IoT network and ensuring the trustworthiness of data analytics. In particular, transferring all IoT data to the cloud for analytics can be costly, inefficient, and in some cases infeasible. Migrating sensor data processing and analysis closer to the edge devices therefore plays a vital role in reducing the amount of data sent to the cloud, the IoT service delay, and the network latency. In this paper, we first enable deep learning models on resource-constrained IoT devices. We then design and implement a real IoT testbed consisting of resource-constrained devices. We also provide a solution to the missing sensor data problem in IoT from the perspectives of edge, fog, and cloud computing. Finally, we compare all three computing approaches in terms of network load, latency, and delay. Experimental results show that the deep learning based edge and fog computing approaches greatly reduce network delay and bandwidth requirements.