Himanshu Savargaonkar is a budding engineer and a devoted electronics hobbyist. A problem solver by nature, he has completed multiple software and embedded projects, and he is a contributor to and supporter of the open-source community. Himanshu is currently pursuing his undergraduate degree at the Vellore Institute of Technology, India, with a major in electronics and communication and a minor in computer science.
Himanshu is an amateur equestrian athlete with national-level dressage competition awards to his credit. In his leisure time, he plays the tabla, an Indian percussion instrument.
Until recently, the key platforms in the embedded space were microcontrollers, microprocessors, and FPGAs. But now the playing field is changing. With the inclusion of GPUs, AI accelerators, and blockchain chips in the embedded ecosystem, the whole field is primed to enter a new era. I believe that this new era of embedded systems will be powered by and based on Edge Computing.
Edge Computing can be described as a paradigm shift from traditional cloud-centric artificial intelligence. It aims to bring AI and ML closer to the edge, executing models on edge nodes rather than on centralized servers. This idea has many advantages, and applications have started appearing across the industry.
This short talk introduces the concept and idea behind edge computing. We look at the advantages and disadvantages currently present in the technology. We then explore Fog Computing, a sub-part of edge computing, in detail. Drawing on real-world examples, we predict the future of the technology.
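The core idea of the abstract, running models on edge nodes rather than streaming everything to centralized servers, can be sketched in a few lines. This is a minimal illustrative example, not code from the talk: the `EdgeNode` class, its threshold "model", and the uplink buffer are all hypothetical stand-ins for on-device inference and a cloud connection.

```python
# Hypothetical sketch of edge-side processing: run a lightweight model
# locally and send only compact results upstream, instead of shipping
# every raw sensor reading to a centralized server.

class EdgeNode:
    def __init__(self, threshold=50.0):
        self.threshold = threshold  # stand-in for a trained model's decision rule
        self.uplink = []            # messages that would actually reach the cloud

    def infer(self, reading):
        # Placeholder for on-device ML inference: flag readings over threshold.
        return reading > self.threshold

    def process(self, readings):
        # Local inference on every sample; only anomalies use the uplink.
        for r in readings:
            if self.infer(r):
                self.uplink.append({"value": r, "label": "anomaly"})

node = EdgeNode()
node.process([12.0, 48.5, 73.2, 20.1, 91.7])
print(len(node.uplink))  # 2 of 5 readings transmitted
```

The point of the sketch is the ratio: five readings are processed, but only two messages leave the node, which is where the latency and bandwidth advantages of edge computing come from.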
How does Himanshu Savargaonkar define Edge Computing in the talk?
A. A paradigm in which substantial computing and storage resources are placed at the Internet's edge, in close proximity to mobile devices or sensors.
B. A model where all computation and storage are moved entirely off the cloud to tiny microcontrollers.
C. A cloud-centric approach that centralizes processing in large datacenters to minimize edge complexity.
D. A method that only uses GPUs and AI accelerators inside the cloud to run embedded workloads.
E. A system that only stores sensor data at the edge but always sends it to the cloud for processing.
What was the primary practical reason Himanshu gave for performing image processing onboard a drone in the offshore search-and-rescue case study?
A. To provide very low latency and reduce dependence on unreliable long-distance communication.
B. To maximize use of cloud servers by streaming all images to shore for central processing.
C. To take advantage of near-unlimited compute power available on drones compared to the cloud.
D. To ensure all raw data is continuously transmitted so the base station can archive everything.
E. To decrease onboard computation in favor of manual review of images later.
According to the talk, what role does 'fog computing' play in the edge architecture?
A. A distributed layer of many small compute nodes (microcontrollers/processors) near devices that can be combined (via mesh/swarm) to scale processing at the edge.
B. A single powerful movable server that replaces cloud datacenters.
C. A synonym for top-level cloud computing in the Edge architecture.
D. A storage-only layer that prevents any computation from happening at the edge.
E. A marketing term with no practical architecture implications.