FYI @here

-------- Original message --------
From: Nic Lane <nicl...@acm.org>
Date: 08/26/2017 3:33 AM (GMT-08:00)
To: Nic Lane <nicl...@acm.org>
Subject: CFP: IEEE Computer Magazine -- Special Issue on Mobile and Embedded 
Deep Learning

Hi folks, please consider submitting to the upcoming IEEE Computer Magazine 
special issue on mobile and embedded forms of deep learning: 
https://www.computer.org/computer-magazine/2017/07/10/mobile-and-embedded-deep-learning-call-for-papers/

Below you will find the complete CFP. The submission deadline is September 15. 
Please email us with any questions.

Best,

Nic Lane (UCL and Nokia Bell Labs) and Pete Warden (Google Brain)
Special Issue Guest Editors -- IEEE Computer Magazine

===

Call for Papers: IEEE Computer Magazine -- Special Issue on Mobile and Embedded 
Deep Learning

DEADLINE (EXTENDED): 15 September 2017
Publication date: April 2018 (with recommended early release on arxiv.org)

https://www.computer.org/computer-magazine/2017/07/10/mobile-and-embedded-deep-learning-call-for-papers/

In recent years, breakthroughs in deep learning have transformed how sensor 
data from cameras, microphones, accelerometers, LIDAR, and GPS can be analyzed 
to extract the high-level information needed by increasingly commonplace 
sensor-driven systems, ranging from smartphone apps and wearable devices to 
drones, robots, and autonomous cars.

Today, the state-of-the-art computational models that, for example, recognize 
a face in a crowd, translate one language into another, discriminate between a 
pedestrian and a stop sign, or monitor a user's physical activities are 
increasingly based on deep-learning principles and algorithms. Unfortunately, 
deep-learning models typically place severe demands on local device resources, 
which limits their adoption in mobile and embedded platforms. As a result, in 
far too many cases, existing systems process sensor data with machine-learning 
methods that were superseded by deep learning years ago.

Because the robustness and quality of sensory perception and reasoning are so 
critical to mobile and embedded computing, we must begin the careful work of 
addressing two core technical questions. First, how should existing 
deep-learning techniques be applied, and new forms of deep learning be 
developed, to ensure that the sensor-inference problems central to this class 
of computing are adequately addressed? Meeting this challenge involves a range 
of learning applications, some familiar from other domains (such as image and 
audio processing) and others more uniquely tied to wearable and mobile systems 
(such as activity recognition). Second, what will be required to reduce the 
compute, memory, and energy overhead of current and future deep-learning 
innovations and to integrate them effectively into a variety of 
resource-constrained platforms? Solutions to such efficiency challenges will 
come from innovations in algorithms, systems software, and hardware (such as 
ML accelerators and changes to conventional processors).

In this special issue of Computer, the guest editors aim to consider these two 
broad themes, which drive further advances in mobile and embedded deep 
learning. Specific topics of interest include, but are not limited to, the 
following:

= Compression of deep model architectures;
= Neural-based approaches for modeling user activities and behavior;
= Quantized and low-precision neural networks (including binary networks);
= Mobile vision supported by convolutional and deep networks;
= Optimizing commodity processors (GPUs, DSPs, etc.) for deep models;
= Audio analysis and understanding through recurrent and deep architectures;
= Hardware accelerators for deep neural networks;
= Distributed deep model training approaches;
= Applications of deep neural networks with real-time requirements;
= Deep models of speech and dialog interaction on mobile devices; and
= Partitioned networks for improved cloud- and processor-offloading.

SUBMISSION DETAILS

Only submissions that describe previously unpublished, original, 
state-of-the-art research and that are not currently under review by a 
conference or journal will be considered.

There is a strict 6,000-word limit (figures and tables are equivalent to 300 
words each) for final manuscripts. Authors should be aware that Computer cannot 
accept or process papers that exceed this word limit.

Articles should be understandable by a broad audience of computer science and 
engineering professionals, avoiding a focus on theory, mathematics, jargon, and 
abstract concepts.

All manuscripts are subject to peer review on both technical merit and 
relevance to Computer’s readership. Accepted papers will be professionally 
edited for content and style. For accepted papers, authors will be required to 
provide electronic files for each figure according to the following guidelines: 
for graphs and charts, authors must submit them in their original editable 
source format (PDF, Visio, Excel, Word, PowerPoint, etc.); for screenshots or 
photographs, authors must submit high-resolution files (300 dpi or higher at 
the largest possible dimensions) in JPEG or TIFF formats.

Authors of accepted papers are encouraged to submit multimedia, such as a 2- to 
4-minute podcast, videos, or an audio or audio/video interview of the authors 
by an expert in the field, which Computer staff can help facilitate, record, 
and edit.

For author guidelines and information on how to submit a manuscript 
electronically, visit www.computer.org/web/peer-review/magazines. For full 
paper submission, please visit mc.manuscriptcentral.com/com-cs.

QUESTIONS?

Please direct any correspondence before submission to the guest editors:

Nic Lane, University College London and Nokia Bell Labs (nic.l...@cs.ucl.ac.uk)

Pete Warden, Google Brain (petewar...@google.com)
