Jacob Chakareski is an Assistant Professor of Electrical and Computer Engineering at the University of Alabama. His long-term vision is to enable ubiquitous networked immersion and virtual human teleportation to any remote corner of the world, assisting humanity with societal challenges (see Figure). Toward these objectives, he advances the emerging technologies of networked drones, virtual and augmented reality (VR/AR), and volumetric 360° video. He explores holistically the four major system aspects of such future networked VR/AR applications for immersive communication: capture, coding, networking, and reconstruction/user interaction. His investigations draw on techniques from communications and networking, signal processing, and reinforcement learning. His interests include 5G IoT architectures and edge computing in small-cell networks. When time permits, he pursues ultrasonic applications in telemedicine, remote sensing, and biomedicine, as well as the integration of multi-view imaging into cyber-physical health-care devices and systems, and he is passionate about bridging science and technology through entrepreneurial activity.

On the personal side, Chakareski has broad interests spanning anthropology, history, foreign languages, travelling, healthy living, classical music, and sports in general.

Jul. 2017: IEEE INFOCOM 2018 paper on caching in small cell networks submitted.
Jul. 2017: IEEE Globecom 2017 paper on coalition formation for task/resource assignment in drone networks submitted.
Jul. 2017: IEEE MMSP 2017 paper on convexity analysis of synthesis distortion in multi-view imaging accepted.
Jul. 2017: ACM MM 2017 paper on optimal set of 360° video representations for viewport-adaptive streaming accepted.
Jun. 2017: Invited talk on drone networks and virtual and augmented reality at Intel Research Santa Clara.
Jun. 2017: ACM MobiSys 2017 paper on drone networks for virtual human teleportation presented.
Jun. 2017: ACM TMM paper on fine-grained scalable video caching and networking submitted.
May 2017: IEEE ICC 2017 paper on viewport-adaptive 360° video streaming presented (Best Paper Award).
May 2017: Attending the SCIEN Workshop on Augmented and Mixed Reality at Stanford.
May 2017: IEEE INFOCOM 2017 paper on UAV-IoT sensing for networked virtual and augmented reality presented.
Prior news
Recent results:
  • IEEE Globecom 2017: "A Coalition Formation Approach to Coordinated Task Allocation in Heterogeneous UAV Networks" (in review).
  • ACM MM 2017: "Optimal Set of 360-Degree Videos for Viewport-Adaptive Streaming" (to appear).
  • IEEE MMSP 2017: "Convexity Characterization of Virtual View Reconstruction Error in Multi-View Imaging" (to appear).
  • ACM TMM 2017: "Fine-grained Scalable Video Caching and Networking" (in review).
  • ACM SIGCOMM 2017: "VR/AR Immersive Communication: Caching, Edge Computing, and Transmission Trade-Offs" (to appear).
  • ACM MobiSys 2017: "Drone Networks for Virtual Human Teleportation" (to appear).
  • IEEE ICC 2017: "Viewport-Adaptive Navigable 360-Degree Video Delivery" (Best Paper Award).
  • IEEE INFOCOM 2017: "Aerial UAV-IoT Sensing for Ubiquitous Immersive Communication and Virtual Human Teleportation".
Major awards & distinctions:
Media coverage:
Recent service:
  • TPC member: Interconnected Virtual Reality Workshop @ IEEE Globecom 2017
  • Organizer: Special Session on Immersive VR/AR Experiences @ ACM MMSys 2017
  • Guest editor: IEEE TCSVT Special Issue on Mobile Visual Cloud (January 2017)
  • Technical program chair: IEEE Packet Video AR Workshop 2016
  • Demo/Expo chair: IEEE ICME 2016
Tech involvement:
  • Frame: The network is your computer. [The future is here: Personalized cloud apps for everyone.]
  • Vidyocast: Scalable (SVC) video IP broadcast. [An industry first.]