Designing usable mobile interfaces for spatial data
Abstract
This dissertation deals mainly with the discipline of Human-Computer Interaction (HCI), with particular attention to the role it plays in the domain of modern mobile devices. Mobile devices today offer crucial support to a plethora of daily activities for nearly everyone. Ranging from checking business email while traveling, to accessing social networks while in a mall, to carrying out business transactions while out of the office, to using all kinds of online public services, mobile devices play the important role of connecting people while they are physically apart. Modern mobile interfaces are therefore expected to improve the user's interaction experience with the surrounding environment and to offer different adaptive views of the real world.
The goal of this thesis is to enhance the usability of mobile interfaces for spatial data. Spatial data are data in which the spatial component plays an important role in clarifying the meaning of the data themselves. Nowadays, this kind of data is widespread in mobile applications: spatial data are present in games, map applications, mobile community applications and office automation tools. In order to enhance the usability of spatial data interfaces, my research investigates two major issues:
1. Enhancing the visualization of spatial data on small screens
2. Enhancing text-input methods
I selected the Design Science Research approach to investigate the above research questions. The idea underlying this approach is that "you build an artifact to learn from it"; in other words, researchers clarify what is new in their design. The new knowledge derived from the artifact is presented in the form of interaction design patterns, in order to support developers in dealing with the issues of mobile interfaces.
The thesis is organized as follows. Initially I present the broader context, the research questions and the approaches I used to investigate them. Then the results are split into two main parts. In the first part I present the visualization technique called Framy. The technique is designed to support users in visualizing geographical data in mobile map applications. I also introduce a multimodal extension of Framy obtained by adding sounds and vibrations. After that I present the process that turned the multimodal interface into a means of allowing visually impaired users to interact with Framy. Some projects involving the design principles of Framy are shown in order to demonstrate the adaptability of the technique to different contexts. The second part concerns the issues related to text-input methods. In particular I focus on the work done in the area of virtual keyboards for mobile devices. A new kind of virtual keyboard, called TaS, provides users with an input system that is more efficient and effective than the traditional QWERTY keyboard. Finally, in the last chapter, the knowledge acquired is formalized in the form of interaction design patterns.