DESCRIPTION JP2016188959

Patent Translate
Powered by EPO and Google
Notice
This translation is machine-generated. It cannot be guaranteed that it is intelligible, accurate,
complete, reliable or fit for specific purposes. Critical decisions, such as commercially relevant or
financial decisions, should not be based on machine-translation output.
Abstract: [Problem] To provide an output device capable of alerting a person who is moving while listening to music, for example, that a station is approaching, and of increasing interest in the reproduced music itself. [Solution] Information indicating a moving path that includes at least two stations on a user's route and the track connecting those stations, together with information indicating the direction of movement along the moving path, is acquired. At the same time, based on the acquired information, music data associated with a station or an inter-station section is emitted from the speaker 2. When the train departs from or approaches a station, the reproduction mode of the music being output is changed to a reproduction mode corresponding to that departure or approach. [Selected figure] Figure 3
OUTPUT DEVICE, OUTPUT METHOD, AND PROGRAM FOR OUTPUT DEVICE
[0001]
The present application belongs to the technical field of an output device, an output method, and
a program for an output device. More specifically, the present invention belongs to the technical
field of an output device and an output method for outputting content information, and a
program for the output device.
[0002]
In recent years, so-called smartphones and portable audio players have become widespread, and
11-04-2019
1
along with this, listening to music while traveling by train, for example, has also become common.
A prior art corresponding to this tendency is, for example, the technology disclosed in
Patent Document 1 below. In this technology, position information related to movement is
associated with reproduction content such as music, and the reproduction content corresponding
to the place indicated by the position information is reproduced.
[0003]
JP 2009-192750 A
[0004]
However, in the technology disclosed in Patent Document 1, the only information serving as the
basis for controlling the reproduction of music is position information; no consideration is
given, for example, to the timing at which announcements in trains and the like are broadcast.
[0005]
For this reason, when music is reproduced while traveling by train, for example, there is a risk
that the listener will miss an in-car announcement accompanying arrival at a station, a platform
announcement broadcast within the station premises, or the like.
[0006]
Therefore, the present application has been made in view of the above problems. An example of its
objects is to provide an output device, an output method, and a program for the output device
that can, for example, alert a person who is moving while listening to music and also increase
interest in the reproduced music itself.
[0007]
In order to solve the above problems, the invention according to claim 1 comprises: acquisition
means for acquiring travel path information, which includes point information indicating at least
two points on a user's travel path and inter-point path information indicating the inter-point
paths connecting those points, together with direction information indicating the direction of
the user's movement along the travel path; output means for outputting, in accordance with the
user's movement, content information associated in advance with at least one of a point, an
inter-point path, or a direction; and control means for performing output control that controls
the output means so as to change the output mode of the content information being output in
response to the user leaving a point or approaching a point.
[0008]
In order to solve the above problems, the invention according to claim 9 is an output method
executed in an output device comprising output means for outputting, in accordance with the
movement of a user, content information associated in advance with at least one of at least two
points on the user's travel path, the inter-point paths connecting those points, or the direction
of the user's movement along the travel path. The method comprises: an acquisition step of
acquiring travel path information including point information indicating each of the points,
inter-point path information indicating the inter-point paths, and direction information
indicating the direction of movement; an output step of outputting the content information from
the output means; and a control step of controlling the output means so as to change the output
mode of the content information being output in response to the user leaving a point or
approaching a point.
[0009]
In order to solve the above problems, the invention according to claim 10 is a program that
causes a computer included in an output device, which comprises output means for outputting, in
accordance with the movement of a user, content information associated in advance with at least
one of at least two points on the user's travel path, the inter-point paths connecting those
points, or the direction of the user's movement along the travel path, to function as the output
device according to any one of claims 1 to 8.
[0010]
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram showing a schematic configuration of an output device according to an embodiment.
FIG. 2 is a block diagram showing a schematic configuration of a reproduction system according to an example.
FIG. 3 is a block diagram showing schematic configurations of the apparatuses included in the reproduction system according to the example, in which (a) shows the schematic configuration of the server apparatus included in the reproduction system and (b) shows the schematic configuration of the reproduction apparatus.
FIG. 4 is a diagram illustrating the contents of the route database according to the example.
FIG. 5 is a diagram illustrating the contents of the association table according to the example.
FIG. 6 is a flowchart showing the reproduction process according to the example.
FIG. 7 is a diagram illustrating the reproduction mode of music according to the example.
FIG. 8 is a diagram illustrating the display mode of images according to the example, in which (a) illustrates the display mode of an image at a station and (b) illustrates the display mode of an image while moving between stations.
FIG. 9 is a block diagram showing a schematic configuration of the reproduction apparatus according to a first modification.
[0011]
Next, a mode for carrying out the present application will be described with reference to FIG. 1.
FIG. 1 is a block diagram showing a schematic configuration of an output device according to the
embodiment.
[0012]
As shown in FIG. 1, the output device S according to the embodiment includes an acquisition unit
1, an output unit 2, and a control unit 3.
[0013]
In this configuration, the acquisition unit 1 acquires travel path information, which includes
point information indicating at least two points on the user's travel path and inter-point path
information indicating the inter-point paths connecting those points, together with direction
information indicating the direction of the user's movement along the travel path.
[0014]
Then, the output unit 2 outputs, in accordance with the user's movement, content information
associated in advance with at least one of a point, an inter-point path, or a direction.
[0015]
At this time, the control unit 3 controls the output unit 2 so as to change the output mode of
the content information being output in response to the user leaving a point or approaching a
point.
[0016]
As described above, according to the operation of the output device S according to the
embodiment, the travel path information and the direction information are acquired, and content
information associated with at least one of a point, an inter-point path, or a direction of
movement is output in accordance with the user's movement.
Then, in response to departure from or approach to a point, the output mode of the content
information being output is changed.
Therefore, when content information associated with a point or the like is output in accordance
with the user's movement, the output mode is changed upon departure from or approach to the
point, so the user can be alerted to the departure or approach, and interest in the content
information itself may also increase.
[0017]
Next, a specific example corresponding to the above-described embodiment will be described using
FIGS. 2 to 8.
The example described below applies the embodiment to the control of the reproduction mode of
music in a reproduction apparatus that is carried by a user and moves with the user.
Here, the music is, for example, music distributed to the reproduction apparatus via a network
such as the Internet. Specific examples of the reproduction apparatus include a smartphone and a
music player. Furthermore, in the example, a plurality of reproduction apparatuses having the
same configuration can be connected via the network to the server apparatus serving as the
distribution source of the music.
[0018]
Furthermore, FIG. 2 is a block diagram showing a schematic configuration of the reproduction
system according to the example, FIG. 3 is a block diagram showing schematic configurations of
the reproduction apparatus and other apparatuses included in the reproduction system, and FIG. 4
is a diagram illustrating the contents of the route database. FIG. 5 is a diagram illustrating
the contents of the association table according to the example, FIG. 6 is a flowchart showing the
reproduction process according to the example, FIG. 7 is a diagram illustrating the reproduction
mode of music according to the example, and FIG. 8 is a diagram illustrating the display mode of
images according to the example. In FIG. 3, constituent members of the example that correspond to
constituent members of the output device S according to the embodiment shown in FIG. 1 are given
the same reference numerals.
[0019]
As shown in FIG. 2, reproduction apparatuses ST1, ST2, ..., STn (n is a natural number) according
to the example, each having the same configuration, are connected to the server apparatus SV via
the network NW and can exchange various data with the server apparatus SV. The reproduction
apparatuses ST1 to STn, the network NW, and the server apparatus SV together constitute a
reproduction system SS according to the example. In the following description, when matters
common to the reproduction apparatuses ST1 to STn are described, they are collectively referred
to simply as "the reproduction apparatus ST".
[0020]
As shown in FIG. 3A, the server apparatus SV according to the example comprises an interface 20,
a processing unit 21 including a CPU, a random access memory (RAM), and a read-only memory (ROM),
a recording unit 22 including a hard disk drive (HDD), a solid-state drive (SSD), or the like, an
operation unit 23 including a keyboard and a mouse, and a display 24 including a liquid crystal
display or the like.
[0021]
At this time, the association table T, the route database RDB, the timetable database TDB, the
music database MDB and the music related database VDB according to the embodiment are
recorded in the recording unit 22 in a non-volatile manner.
[0022]
Here, the route database RDB is, for example, a database in which route data corresponding to the
railway data in the "National Land Numerical Information" published by the Ministry of Land,
Infrastructure, Transport and Tourism is recorded. More specifically, as illustrated in FIG. 4,
the shape and position of the track of each railway line, the positions of the stations, and the
like are recorded as separate data for the up and down directions.
In FIG. 4, the shape and position of each track are indicated by thin solid lines in a range
centered on Shinjuku Station, and the positions and shapes of the stations along each track (the
name of each station is shown in parentheses in FIG. 4) are indicated by thick solid lines.
In the route database RDB, for example, latitude/longitude data indicating the positions and
shapes of these lines and latitude/longitude data indicating the position and shape of each
station's platform are recorded in association with route name data indicating the route name and
station name data indicating the station name.
[0023]
On the other hand, the timetable database TDB is a database in which train identification data
identifying each train or diesel car traveling on each route is recorded in association with time
data indicating the times at which that train or diesel car arrives at and departs from each
station. In the following description, such a train or diesel car is simply referred to as a
"train or the like".
[0024]
Next, in the music database MDB, music data corresponding to the music to be transmitted to each
reproduction apparatus ST and reproduced by the reproduction apparatus ST is recorded in
association with music identification data identifying each piece of music.
[0025]
Further, in the music-related database VDB, for each piece of music whose music data is recorded
in the music database MDB, image data corresponding to images related to that music is recorded
in association with the music identification data.
The images related to a piece of music here include, for example, an image of the jacket used
when the music was put on sale and an image of a distant view of a place deeply related to the
music. The individual image data are mutually distinguishable by image identification data.
[0026]
Finally, as illustrated in FIG. 5, in the association table T, the music identification data, the
station name data, the image identification data, the route name data recorded in the route
database RDB, and direction data indicating the up or down direction on the route indicated by
the route name data are recorded in association with one another. In the reproduction process
according to the example described later, the music identified by the music identification data
associated with given station name data in the association table T is reproduced by the
reproduction apparatus ST when, in the course of movement of a train or the like, the
reproduction apparatus ST arrives at the station indicated by that station name data. Similarly,
the image identified by the image identification data associated with given station name data in
the association table T is displayed on the reproduction apparatus ST when the reproduction
apparatus ST arrives at the station indicated by that station name data. More specifically, in
the case illustrated in FIG. 5, for example, when a train or the like arrives at the station
identified by the station name data "A", the reproduction apparatus ST carried by a user riding
on that train or the like reproduces the music identified by the music identification data "1"
and displays the image identified by the image identification data "I".
[0027]
In the following description, for simplification, the station identified by the station name data
"A", for example, is referred to as "station A", the piece of music identified by the music
identification data "1" is referred to as "song 1", and the image identified by the image
identification data "I" is referred to as "image I".
[0028]
On the other hand, in the association table T illustrated in FIG. 5, the music and images that
are not associated with station name data are reproduced and displayed between stations; for
example, the reproduction of song 2 and the display of image II are performed between station A
and station B. Similarly, the reproduction of song 6 and the display of image VI are performed
between station C and the next station.
[0029]
Here, in the association table T according to the example illustrated in FIG. 5, different pieces
of music are reproduced and different images are displayed for the up and down directions of a
single route. More specifically, in the case of station B, for example, a reproduction apparatus
ST moving with a train in the up direction reproduces song 3 and displays image III at station B,
whereas a reproduction apparatus ST moving with a train in the down direction reproduces song 13
and displays image XIII at station B. Similarly, between station B and station C, a reproduction
apparatus ST moving with a train in the up direction reproduces song 4 and displays image IV,
while a reproduction apparatus ST moving with a train in the down direction reproduces song 14
and displays image XIV.
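A minimal sketch of how the association table T of FIG. 5 could be queried; the key layout, route name, and helper name are assumptions made for illustration, with a station pair standing in for an inter-station section:

```python
# Hypothetical sketch of the association table T: each entry associates a
# route, a travel direction, and either a station (e.g. "B") or an
# inter-station section (modelled here as a (from, to) tuple) with the
# music and image to be presented there.
association_table = {
    ("Route X", "up", "A"): ("song 1", "image I"),
    ("Route X", "up", ("A", "B")): ("song 2", "image II"),
    ("Route X", "up", "B"): ("song 3", "image III"),
    ("Route X", "up", ("B", "C")): ("song 4", "image IV"),
    ("Route X", "down", "B"): ("song 13", "image XIII"),
    ("Route X", "down", ("B", "C")): ("song 14", "image XIV"),
}

def lookup(route, direction, place):
    """Return (music id, image id) for a station name or a station pair."""
    return association_table[(route, direction, place)]
```

As in the example above, the same station B yields song 3 and image III when queried for the up direction but song 13 and image XIII for the down direction.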
[0030]
The reproduction process according to the embodiment using the association table T will be
described in detail later.
[0031]
Next, under the control of the processing unit 21, the interface 20 controls the exchange of the
data recorded in the recording unit 22 with each reproduction apparatus ST via the network NW.
When an operation for causing the server apparatus SV to perform the reproduction process
according to the example is performed on the operation unit 23, the operation unit 23 generates
an operation signal corresponding to the operation and outputs it to the processing unit 21.
[0032]
As a result, the processing unit 21, based on the operation signals from the operation unit 23,
centrally controls the operation of the server apparatus SV in the reproduction process according
to the example while exchanging the data recorded in the recording unit 22 with each reproduction
apparatus ST via the interface 20. Information to be presented to the user of the server
apparatus SV (for example, the administrator of the reproduction system SS according to the
example) during this operation is displayed on the display 24.
[0033]
Next, as shown in FIG. 3B, the reproduction apparatuses ST according to the example have the same
configuration as one another. More specifically, each comprises a processing unit 10 including a
CPU, a RAM, a ROM, and the like, a recording unit 11 including an HDD, an SSD, or the like, an
interface 12, a sensor 13, an operation unit 14 including operation buttons and a touch panel, a
speaker 2₁ corresponding to an example of the output unit 2 according to the embodiment, and a
display 2₂, such as a liquid crystal display, likewise corresponding to an example of the output
unit 2.
[0034]
The processing unit 10 comprises an internal interface 1 corresponding to an example of the
acquisition unit 1 according to the embodiment, a reproduction control unit 3₁ corresponding to
an example of the output means 2 according to the embodiment, which drives the speaker 2₁ to emit
sound, and a display control unit 3₂, likewise corresponding to an example of the output means 2
according to the embodiment, which displays necessary images on the display 2₂.
The internal interface 1, the reproduction control unit 3₁, and the display control unit 3₂ may
be realized by hardware logic circuits such as the CPU constituting the processing unit 10, or
may be realized as software by the CPU or the like reading and executing a program corresponding
to the reproduction process according to the example described later. In the latter case, a
program recorded in advance in the recording unit 11 may be read by the CPU or the like, or a
program acquired via the network NW may be executed by the CPU or the like. Further, as indicated
by broken lines in FIG. 3B, the internal interface 1, the speaker 2₁, the display 2₂, the
reproduction control unit 3₁, and the display control unit 3₂ constitute an example corresponding
to the output device S according to the embodiment.
The sensor 13 corresponds to an example of each of the "time information acquisition means", the
"detection means", the "related information acquisition means", the "acceleration information
acquisition means", the "alighting station information acquisition means", and the "alighting
station broadcast detection means" according to the present application.
[0035]
In this configuration, under the control of the processing unit 10, the interface 12 controls the
exchange of data with the server apparatus SV via the network NW. When an operation for causing
the reproduction apparatus ST to perform the reproduction process according to the example is
performed on the operation unit 14, the operation unit 14 generates an operation signal
corresponding to the operation and outputs it to the processing unit 10.
[0036]
The sensor 13 detects the current position of the reproduction apparatus ST by receiving
navigation radio waves from, for example, so-called GPS (Global Positioning System) navigation
satellites, and outputs current position data indicating the current position to the processing
unit 10. In addition, by receiving identification data transmitted by transmitting devices (not
shown) provided in the train or the like on which the user carrying the reproduction apparatus ST
rides, or in the premises of the station where the user boards or alights, the sensor 13
generates movement position data indicating the train or the like on which the user is riding, or
the station, and outputs it to the processing unit 10.
[0037]
Thus, the processing unit 10 receives the current position data and the movement position data
through the internal interface 1, specifies the current position of the user carrying the
reproduction apparatus ST, and identifies the train or the like on which the user is riding.
Thereafter, the processing unit 10 transmits the current position data indicating the specified
current position, the movement position data indicating the identified train or the like or
station, and, when the reproduction apparatus ST is moving with a train or the like, the
later-described direction data, together with device identification data identifying the
reproduction apparatus ST itself, to the server apparatus SV through the internal interface 1 and
the interface 12. The transmission of these position data to the server apparatus SV corresponds
to a music inquiry process that inquires about the music to be reproduced at the current position
indicated by the position data, on the train or the like, or at the station. Thereafter, when
music data corresponding to the music to be reproduced near the user's current position and image
data corresponding to the image to be displayed at the current position are transmitted from the
server apparatus SV, in the reproduction process according to the example described later and in
response to the above-mentioned inquiry process, the processing unit 10 receives the music data
and the image data via the interface 12 and the internal interface 1 and temporarily records them
in the recording unit 11 in a non-volatile manner.
[0038]
Then, as the reproduction process according to the example, the reproduction control unit 3₁ of
the processing unit 10 reads the recorded music data from the recording unit 11 and emits the
music corresponding to the read music data through the speaker 2₁. Furthermore, as the
reproduction process according to the example, the display control unit 3₂ of the processing unit
10 reads the recorded image data from the recording unit 11 and displays the image corresponding
to the read image data on the display 2₂.
[0039]
In the recording unit 11, map data for guidance processing that guides the user of the
reproduction apparatus ST, a program for performing so-called map-matching processing for that
guidance, and the like are also recorded in a non-volatile manner. The map data includes route
data and the like corresponding to the route name data and the like recorded in the route
database RDB. Using these, the processing unit 10 performs the necessary guidance processing and
the like in addition to the reproduction process according to the example.
[0040]
Next, the reproduction process according to the example, centering on the processing unit 10 of
the reproduction apparatus ST, will be specifically described with reference to FIGS. 3 to 8. The
flowchart shown in FIG. 6 represents the reproduction process according to the example, which is
executed by the reproduction apparatus ST according to the example.
[0041]
As shown in the flowchart of FIG. 6, the reproduction process according to the example is
started, for example, when an operation to start it is performed on the operation unit 14 of the
reproduction apparatus ST. When the reproduction process according to the example is started, the
processing unit 10 of the reproduction apparatus ST first performs map-matching processing that
specifies the position of the reproduction apparatus ST on the map data, using the current
position data and the movement position data from the sensor 13 (step S1). By this map-matching
processing, the current position of the reproduction apparatus ST and either the corresponding
position on the route or the position of a station are specified. More specifically, when the
reproduction apparatus ST is moving with a train or the like, the map-matching processing of step
S1 specifies the position on the route corresponding to the current position. On the other hand,
when the reproduction apparatus ST is not moving (that is, when the user carrying the
reproduction apparatus ST is at a station), the map-matching processing of step S1 specifies the
position of the station corresponding to the current position.
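The map matching of step S1 can be sketched as follows. This is a deliberately simplified stand-in: real map matching projects the fix onto track segments, whereas this sketch only searches for the nearest recorded point, and the station radius and degree-based distances are illustrative assumptions.

```python
import math

# Simplified map-matching sketch (step S1): snap a GPS fix to the nearest
# recorded point, returning a station name when one is close enough (user
# at a station) and otherwise the nearest on-route position (user moving).
def map_match(current, track_points, stations, station_radius=0.001):
    """current: (lat, lon); track_points: list of (lat, lon);
    stations: dict of name -> (lat, lon). Distances are computed in
    degrees here purely for illustration."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    # Prefer a station when the fix lies within its radius.
    name, pos = min(stations.items(), key=lambda kv: dist(current, kv[1]))
    if dist(current, pos) <= station_radius:
        return ("station", name)
    # Otherwise report the nearest recorded position on the route.
    return ("on_route", min(track_points, key=lambda p: dist(current, p)))
```

The two return shapes mirror the two outcomes of step S1: a station position when the apparatus is not moving, or an on-route position when it is moving with a train or the like.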
[0042]
Next, based on the result of step S1, the processing unit 10 determines whether the user carrying
the reproduction apparatus ST is traveling on a train or the like on the track between one
station and the next (step S2). In the following description, the section between a station and
the next station is simply referred to as "between stations". If it is determined in step S2 that
the user carrying the reproduction apparatus ST is traveling between stations on a train or the
like (that is, moving) (step S2: YES), the processing unit 10 further determines, based on the
result of step S2, the moving direction of the train or the like, that is, whether it is moving
in the up or down direction on the route (step S3). Thereafter, based on the current position of
the reproduction apparatus ST specified in step S1 and the position on the route or the position
of the station, the processing unit 10 performs the above-mentioned music inquiry process for the
music associated with the current position or the like (step S4). At this time, the processing
unit 10 transmits to the server apparatus SV the current position data and movement position data
respectively indicating the current position and the train or the like specified in step S1, and
the direction data indicating the moving direction determined in step S3, together with the
device identification data.
[0043]
Then, the processing unit 21 of the server apparatus SV, having received the current position
data, movement position data, direction data, and device identification data, refers to the
association table T and specifies the music identification data and image identification data
associated with the route name data and the direction data corresponding to the movement position
data. In the following description, the music identification data and the image identification
data are referred to simply as "music identification data etc.". Since the user carrying the
reproduction apparatus ST is currently on a train traveling between stations (see step S2: YES),
the processing unit 21 specifies music identification data etc. other than those associated with
station name data. More specifically, for example, when the user carrying the reproduction
apparatus ST is on a train traveling in the up direction between station A and station B (step
S2: YES; see step S3), the processing unit 21 specifies the music identification data indicating
song 2 and the image identification data indicating image II in the association table T
illustrated in FIG. 5.
[0044]
Thereafter, the processing unit 21 acquires, from the music database MDB and the music-related
database VDB, the music data and image data respectively corresponding to the music and image
identified by the specified music identification data etc. The processing unit 21 then transmits
the acquired music data and image data, via the interface 20 and the network NW, to the
reproduction apparatus ST that transmitted the device identification data.
[0045]
Meanwhile, the processing unit 10 of the reproduction apparatus ST that executed step S4
acquires, through the interface 12, the music data and image data transmitted from the server
apparatus SV in response to the current position data, movement position data, and direction data
transmitted in step S4, and temporarily records them, for example, in the recording unit 11
(steps S5 and S6). Thereafter, the processing unit 10 performs the inter-station music
reproduction and image display according to the example using the acquired music data and image
data (step S7). That is, in the reproduction apparatus ST moving between stations, the
reproduction control unit 3₁ of the processing unit 10 reproduces and emits the music
corresponding to the music data acquired in step S5, and the display control unit 3₂ of the
processing unit 10 displays the image corresponding to the image data acquired in step S6 (step
S7). At this time, the reproduction control unit 3₁ reproduces the music data, for example, at
the reproduction volume designated by an operation on the operation unit 14. The display mode of
the image by the display control unit 3₂ will be described collectively later.
[0046]
On the other hand, during the reproduction according to step S7, the processing unit 10 determines, based on the current position data or the movement position data output from the sensor 13, whether or not the train or the like on which the user of the reproduction apparatus ST is riding (and which is moving between stations (see step S2; YES)) has approached the next station (step S8). At this time, the processing unit 10 determines whether or not the train has approached the next station on the basis of, for example, whether or not it has reached a point 50 meters before the next station. When the train is not approaching the next station in the determination of step S8 (step S8; NO), the reproduction control unit 31 and the display control unit 32 of the processing unit 10 return to the above-mentioned step S7 and continue the reproduction of the music and the display of the image. On the other hand, when the next station is approached in the determination of step S8 (step S8; YES), the reproduction control unit 31 and the display control unit 32 of the processing unit 10 each start the reproduction process associated with the approach (step S9). More specifically, the reproduction control unit 31 gradually reduces the reproduction volume so that the reproduction volume of the currently reproduced music becomes zero at the time of arrival at the next station. The reproduction process as step S9 by the display control unit 32 will be described collectively later.
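The approach determination of step S8 and the fade-out of step S9 can be sketched as follows. This is a minimal illustration: the 50-meter threshold comes from the text, while the linear volume ramp and the function names are assumptions (the text only requires that the volume gradually reach zero on arrival at the next station).

```python
def is_approaching(distance_to_station_m: float, threshold_m: float = 50.0) -> bool:
    """Step S8: the train counts as approaching the next station once it
    is within threshold_m (50 m in the text) of that station."""
    return distance_to_station_m <= threshold_m


def fade_out_volume(distance_to_station_m: float, base_volume: float,
                    threshold_m: float = 50.0) -> float:
    """Step S9 (assumed linear ramp): keep base_volume outside the
    threshold, then scale it down so it reaches zero on arrival."""
    if distance_to_station_m >= threshold_m:
        return base_volume
    fraction = max(distance_to_station_m, 0.0) / threshold_m
    return base_volume * fraction
```

Under this linear assumption, for example, half of the user-designated volume would remain at 25 meters before the station.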
[0047]
After that, the processing unit 10 determines whether or not to end the reproduction process according to the embodiment, for example, by an off operation of a power switch (not shown) of the reproduction device ST (step S10). When the reproduction process is to be ended in the determination of step S10 (step S10; YES), the processing unit 10 ends the reproduction process according to the embodiment as it is. On the other hand, when the reproduction process according to the embodiment is to be continued in the determination of step S10 (step S10; NO), the processing unit 10 performs, for the current position of the reproduction device ST, the same map matching process as in step S1 (step S11) in order to acquire the music data corresponding to the song to be reproduced at the approaching station (step S8; YES) and the image data corresponding to the image to be displayed at that station. By this map matching process, as in step S1, the current position of the playback device ST, that is, the position of the train or the like on which the user of the playback device ST is riding as it approaches the station (see step S8; YES), is identified.
[0048]
Next, the processing unit 10 determines, as in step S3, the moving direction of the train or the like on which the user carrying the reproduction apparatus ST is riding, based on the identification result of step S11 (step S12). After that, based on the current position of the playback device ST identified in step S11, the position of the next station, and the moving direction determined in step S12 (that is, the direction of approach to the next station), the processing unit 10 performs the above-described song inquiry process on the song associated with the next station (step S13). At this time, the processing unit 10 transmits to the server device SV the current position data and the movement position data respectively indicating the current position and the train or the like identified in step S11, and the direction data indicating the moving direction determined in step S12, together with the above device identification data.
[0049]
Then, the processing unit 21 of the server apparatus SV that has received the current position data, movement position data, direction data, and device identification data refers to the association table T and identifies the music identification data and the like associated with the route name data and the direction data corresponding to the position data. Since the user carrying the playback device ST is now on a train or the like arriving at the next station (see above step S8; YES), the processing unit 21 identifies the music identification data and the like associated with the station name data of that next station. More specifically, for example, when the user carrying the reproduction apparatus ST is on a train or the like approaching the station C in the upward direction (step S8; YES, see step S12), the processing unit 21 specifies the music identification data indicating the song 5 and the image identification data indicating the image V in the association table T illustrated in FIG. 5.
[0050]
Thereafter, the processing unit 21 acquires, from the music database MDB and the music related
database VDB, music data and image data respectively corresponding to the music and images
identified by the specified music identification data and the like. Then, the processing unit 21
transmits the acquired music data and image data to the reproduction device ST that has
transmitted the device identification data via the interface 20 and the network NW.
[0051]
On the other hand, the processing unit 10 of the playback device ST that has executed step S13 next acquires, through the interface 12, the music data and the image data transmitted from the server device SV in correspondence with the current position data, movement position data, and direction data transmitted in step S13, and temporarily records them, for example, in the recording unit 11 (steps S14 and S15). Thereafter, the processing unit 10 performs in-station music reproduction and image display according to the embodiment using the acquired music data and image data (step S16). That is, in the reproducing apparatus ST approaching the next station, the reproduction control unit 31 starts reproduction and sound emission of the music corresponding to the music data using the music data acquired in step S14, and the display control unit 32 starts displaying the image corresponding to the image data using the image data acquired in step S15 (step S16). At this time, the reproduction control unit 31 starts reproduction of the music associated with the next station at the same timing as the start of the approach reproduction process of step S9, and gradually increases its reproduction volume. The display mode of the image as step S16 by the display control unit 32 will be described collectively later. During the reproduction and display of step S16, the reproduction of the music and the display of the image associated with the next station according to step S16 are continued even after the train or the like on which the user carrying the reproduction apparatus ST is riding arrives at the next station.
[0052]
On the other hand, during the reproduction according to step S16 after arrival at the next station, the processing unit 10 determines, based on the current position data or movement position data output from the sensor 13, whether or not the train or the like on which the user of the reproduction apparatus ST is riding has left the next station (i.e., whether it has departed from the station) (step S17). At this time, based on the current position data or movement position data output from the sensor 13, the processing unit 10 determines whether or not the train has departed from the next station by determining whether or not the stopped train or the like has started to move. When the train is still stopped at the next station in the determination of step S17 (step S17; NO), the reproduction control unit 31 and the display control unit 32 return to step S16 and continue the reproduction of the music and the display of the image performed until then. On the other hand, if it is determined in step S17 that the train has departed from the next station (step S17; YES), the reproduction control unit 31 and the display control unit 32 each start the reproduction process associated with the departure (step S18). More specifically, the reproduction control unit 31 gradually reduces the reproduction volume of the currently reproduced music from the timing of departure from the next station so that it finally becomes zero. The reproduction process as step S18 by the display control unit 32 will be described collectively later.
[0053]
After that, the processing unit 10 determines whether or not to end the reproduction process according to the embodiment, for example, by an off operation of the power switch (step S19). When the reproduction process is to be ended in the determination of step S19 (step S19; YES), the processing unit 10 ends the reproduction process according to the embodiment as it is. On the other hand, when the reproduction process according to the embodiment is to be continued in the determination of step S19 (step S19; NO), the processing unit 10 performs, for the current position of the reproduction device ST, the same map matching process as in step S1 (step S20) in order to acquire and then reproduce or display the music data corresponding to the song to be reproduced between the stations after the next station and the image data corresponding to the image to be displayed between those stations. After that, the processing of steps S3 to S12 is repeated for the playback device ST that moves with the train or the like traveling between stations after departure from the next station.
[0054]
In the reproduction apparatus ST carried by a user who is on a train or the like traveling between stations and arriving at the next station, the processing of the above steps S3 to S20 thus performs a so-called cross-fade process using the song and image associated with the section before the next station and the song and image associated with the next station. Similarly, when departing from the next station, a cross-fade process is performed using the song and image associated with the next station and the song and image associated with the section after the next station. These cross-fade processes will be described together later.
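The cross-fade between the outgoing song (for the current section) and the incoming song (for the next station) can be sketched as follows; the linear, complementary ramps are an assumption, since the text only states that one volume is gradually reduced to zero while the other is gradually raised to the user-designated volume over the same interval.

```python
def crossfade_volumes(progress: float, base_volume: float):
    """Return (outgoing_volume, incoming_volume) for a cross-fade.

    progress runs from 0.0 (start of the fade, e.g. the 50 m point of
    step S8) to 1.0 (arrival at the next station); linear ramps are
    assumed.
    """
    p = min(max(progress, 0.0), 1.0)  # clamp progress into [0, 1]
    return base_volume * (1.0 - p), base_volume * p
```

At progress 0.5 both songs would play at half the designated volume; the same helper applies to the departure-side cross-fade with the roles of the two songs exchanged.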
[0055]
On the other hand, when it is determined in step S2 that the user carrying the playback device ST is not traveling between stations on a train or the like, that is, that the train or the like is stopped at one of the stations (step S2; NO), the processing unit 10 next executes the above-described steps S13 to S20 for the station at which the train is stopped. That is, based on the current position of the playback apparatus ST identified in step S1 and the position of the station, the processing unit 10 first performs the song inquiry process on the song associated with the stopped-at station (step S13).
[0056]
Then, the processing unit 21 of the server apparatus SV that has received the current position data, movement position data, direction data, and device identification data refers to the association table T and identifies the music identification data and the like associated with the route name data and the direction data corresponding to the movement position data. Since the user carrying the playback apparatus ST has arrived at the above station (see above step S2; NO), the processing unit 21 identifies the music identification data and the like associated with the station name data of that station. More specifically, for example, when the user carrying the playback apparatus ST has arrived at the station B in the upward direction (see above step S2; NO), the processing unit 21 specifies the music identification data indicating the song 3 and the image identification data indicating the image III in the association table T illustrated in FIG. 5.
[0057]
Thereafter, the processing unit 21 acquires, from the music database MDB and the music related
database VDB, music data and image data respectively corresponding to the music and images
identified by the specified music identification data and the like. Then, the processing unit 21
transmits the acquired music data and image data to the reproduction device ST that has
transmitted the device identification data via the interface 20 and the network NW.
[0058]
On the other hand, the processing unit 10 of the playback device ST that has executed step S13 next acquires, through the interface 12, the music data and the image data transmitted from the server apparatus SV in response to the position data and the direction data transmitted in step S13, and temporarily records them, for example, in the recording unit 11 (steps S14 and S15). Thereafter, the processing unit 10 performs in-station music reproduction and image display according to the embodiment using the acquired music data and image data (step S16). That is, in the reproducing apparatus ST that has arrived at the station, the reproduction control unit 31 reproduces and emits the music corresponding to the music data using the music data acquired in step S14 (step S16), and the display control unit 32 displays the image corresponding to the image data using the image data acquired in step S15 (step S16). At this time, the reproduction control unit 31 reproduces the music data, for example, at the reproduction volume designated by an operation on the operation unit 14. The display mode of the image by the display control unit 32 will be described collectively later.
[0059]
Next, based on the current position data or movement position data output from the sensor 13 during the processing according to step S16, the processing unit 10 determines whether or not the train or the like on which the user of the reproduction apparatus ST is riding has departed from the station (step S17). When the train is still stopped at the station in the determination of step S17 (step S17; NO), the reproduction control unit 31 and the display control unit 32 return to step S16 and continue the reproduction of the music and the display of the image performed until then. On the other hand, when the train departs from the station in the determination of step S17 (step S17; YES), the reproduction control unit 31 and the display control unit 32 each start the reproduction process associated with the departure (step S18). More specifically, the reproduction control unit 31 gradually reduces the reproduction volume of the currently reproduced music from the timing of departure from the station so that it finally becomes zero. The reproduction process as step S18 by the display control unit 32 will be described collectively later.
[0060]
After that, the processing unit 10 performs the processing of the above-mentioned steps S19 and S20, and repeats the processing of the above-mentioned steps S3 to S12 for the reproduction device ST that moves with the train traveling between stations after the departure.
[0061]
In the reproduction apparatus ST carried by the user who gets on the train or the like departing from the arrival station, the processing of the above steps S13 to S20 thus performs a cross-fade process using the song and image associated with the arrival station and the song and image associated with the section after departure from that station. These cross-fade processes will also be described together later.
[0062]
Next, an example of the reproduction mode of the music when the reproduction process according to the embodiment shown in the flowchart of FIG. 6 is executed, and an example of the corresponding display mode of the image, will be described using FIG. 7 and FIG. 8.
[0063]
First, a reproduction mode of music when reproduction processing according to the embodiment
is performed will be described using FIG. 7.
[0064]
For example, assume that the association table T illustrated in FIG. 5 is recorded in the server SV and that a user carrying the reproduction apparatus ST is on a train or the like traveling in the upward direction on the route on which the station B and the station C are located. In this case, when the reproduction process according to the embodiment is executed in the reproduction apparatus ST, the song 2 is first reproduced at the reproduction volume designated by the user at the stage before arriving at the station B (see FIG. 6 steps S1 to S7 and the solid line and reference "(song 2)" in FIG. 7). Then, at the timing t1 when the train or the like reaches, for example, 50 meters before the station B (see FIG. 6 step S8; YES and the reference "t1" in FIG. 7), the reduction of the reproduction volume of the song 2 is started, and the reproduction of the song 3 associated with the station B is started (see FIG. 6 steps S8 to S16 and the broken line and reference "(song 3)" in FIG. 7).
For this song 3, the reproduction volume is gradually increased from the timing t1 and reaches the reproduction volume designated by the user at the timing t2 of arrival at the station B; thereafter, reproduction at that volume is continued while the train is stopped at the station B. Then, from the timing t3 at which the departure from the station B is detected, the reduction of the reproduction volume of the song 3 is started (FIG. 6, steps S17 and S18).
[0065]
Next, from the timing t3 of departure from the station B, the gradual increase of the reproduction volume of the song 4 associated with the section between the station B and the station C is started (see steps S3 to S7 after step S20 in FIG. 6 and the reference "(song 4)" in FIG. 7), and the reproduction of the song 4 between the stations is then continued until the timing t4 when, for example, a point 50 meters before the next station C is reached (see the dotted line in FIG. 7). Thereafter, the reproduction process of the song 5 associated with the station C and the reproduction process of the song 6 associated with the section after departure from the station C are repeated as illustrated in FIG. 7.
[0066]
On the other hand, as a display mode of the image according to the embodiment in the reproduction device ST, for example, in a state of arrival at Shinjuku Station on the Yamanote Line, the image illustrated in FIG. 8A is displayed on the display 22 of the reproduction device ST (see step S16 above). In the example illustrated in FIG. 8A, in the corresponding association table T, the song "Aoo" of the singer "Kohko" is associated with Shin-Okubo Station, one station before on the Yamanote Line, the song "Let's Gogo" of the singer "て" is associated with Shinjuku Station, and the song "ACE" of the singer "乙乙" is associated with Yoyogi Station. Along with this, in the association table T, a music singer display 101 and a music image 100 indicating the song "Aoo" are associated with Shin-Okubo Station, a music singer display 103 and a music image 102 indicating the song "Let's Gogo" are associated with Shinjuku Station, and a music singer display 105 and a music image 104 indicating the song "ACE" are associated with Yoyogi Station. Note that, in the association table T corresponding to the case illustrated in FIG. 8, unlike the association table T illustrated in FIG. 5, it is assumed that no image and music are associated with the sections between the stations.
[0067]
When the user carrying the playback apparatus ST is on a train or the like stopped at Shinjuku Station (see steps S1 and S2 and steps S13 to S15 in FIG. 6), the playback control unit 31 reproduces the song "Let's Gogo" as the music (see step S16 in FIG. 6). In this case, the display 22 shows, together with an arrow 110 indicating that the train is moving from Shin-Okubo Station to Shinjuku Station and then to Yoyogi Station, the music singer display 101 and the music image 100, the music singer display 103 and the music image 102, and the music singer display 105 and the music image 104. At this time, the music singer display 103 and the music image 102 indicating the song "Let's Gogo" associated with Shinjuku Station, where the playback device ST is currently located, are displayed in the center of the display 22 together with the characters "playing". On the other hand, the music singer display 101 and the music image 100 indicating the song "Aoo" associated with Shin-Okubo Station, which corresponds to the station one before, are displayed smaller at the left end of the display 22 together with the word "past". The music singer display 105 and the music image 104 indicating the song "ACE" associated with Yoyogi Station, which corresponds to the station one ahead, are displayed at the right end of the display 22 in the same size as the music singer display 101 and the music image 100, together with the word "next". Furthermore, on the display 22 in this case, a route display 200 indicating that the route is the Yamanote Line extending from Shin-Okubo Station through Shinjuku Station to Yoyogi Station is displayed near the lower side, together with a station display 201 indicating Shinjuku Station, using the green representing the Yamanote Line and in a manner resembling a station name signboard at an actual station. As a result, the image allows the user to intuitively recognize that the train is currently stopped at Shinjuku Station.
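The tile layout of FIG. 8A can be sketched as a simple role-to-slot mapping; the concrete position and size labels below are illustrative assumptions, since the text only specifies the relative arrangement (previous station small at the left with "past", current station large in the center with "playing", next station small at the right with "next").

```python
def display_slot(station_role: str) -> dict:
    """Map a station's role relative to the current position to the
    tile attributes described for FIG. 8A (assumed labels)."""
    slots = {
        "past":    {"position": "left",   "size": "small", "caption": "past"},
        "current": {"position": "center", "size": "large", "caption": "playing"},
        "next":    {"position": "right",  "size": "small", "caption": "next"},
    }
    return slots[station_role]
```

On departure (FIG. 8B), such a mapping would be recomputed with each station's role shifted by one, which reproduces the leftward movement and resizing of the tiles described below.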
[0068]
Next, in a state where the image illustrated in FIG. 8A is displayed on the display 22, when the train on which the user carrying the reproduction apparatus ST is riding leaves Shinjuku Station (see FIG. 6, step S17; YES), the song "Let's Gogo" associated with Shinjuku Station continues to be reproduced as the music, but, as illustrated in FIG. 8B, the music singer display 101 and the music image 100 associated with Shin-Okubo Station are first moved to the left end of the display 22 and made smaller than in the case illustrated in FIG. 8A, and are displayed together with the word "past" (see FIG. 6 step S18). In addition, the music singer display 103 and the music image 102 associated with the departed Shinjuku Station move slightly leftward from the center of the display 22 and become slightly smaller than in the case illustrated in FIG. 8A, and are displayed together with the word "playing" (see FIG. 6 step S18). Furthermore, the music singer display 105 and the music image 104 associated with Yoyogi Station, at which the train will arrive next, move slightly leftward on the display 22 and become slightly larger than in the case illustrated in FIG. 8A, and are displayed together with the word "next" (see FIG. 6 step S18). In addition, as illustrated in FIG. 8B, from the timing of leaving Shinjuku Station, the route display 200 displayed in the case illustrated in FIG. 8A changes to a green route display 210 indicating that the train is moving from Shinjuku to Yoyogi (see FIG. 6 step S18). In this route display 210, a current position mark PM indicating the position of the train or the like moving from Shinjuku Station to Yoyogi Station is displayed together with a user mark YM indicating that the user carrying the reproduction apparatus ST is present at the position of that train or the like. As a result, the image allows the user to intuitively recognize that the train is currently moving from Shinjuku Station to Yoyogi Station (in other words, that the next station is Yoyogi Station). In the case illustrated in FIG. 8B, the display positions of the current position mark PM and the user mark YM may be configured to move toward the direction of the next station (rightward in FIG. 8B) in accordance with the movement of the train.
[0069]
In either of the examples shown in FIGS. 8A and 8B, a purchase button 120, which is operated, for example, when purchasing the music data corresponding to the currently displayed music via an electronic commerce system, is displayed on the display 22 together with the associated image and the like.
[0070]
As described above, according to the reproduction process according to the embodiment, the current position data, the movement position data, the direction data, the music data, and the like are acquired, and the acquired music data and image data are reproduced and displayed in association with the station or the section between stations, the route, and the moving direction.
At this time, in accordance with departure from a station or approach to a station, the reproduction mode of the music data and the display mode of the image data output before the departure or approach are controlled to change to a reproduction mode and a display mode corresponding to that departure or approach (see step S9 and step S18 in FIG. 6). Therefore, when a song associated with a station or a section between stations is reproduced, the reproduction mode and the display mode change as the train departs from or approaches a station, so that the user of the reproduction apparatus ST can be alerted to the departure or approach, and interest in the music itself can also be increased.
[0071]
Also, in response to the approach to a station, control is performed to lower the reproduction volume of the song reproduced up to that point, and after the approach to the station, the song reproduced so far is switched to another song associated with that station and reproduced. Therefore, the approach to the station can be effectively recognized by the user, and the song associated with the approached station can be reproduced promptly.
[0072]
Furthermore, the image associated with a station or the like is displayed using the color or shape indicating the line to which that station belongs (in the example shown in FIG. 8, the green of the Yamanote Line and a display form resembling the station name signboard of the station) (see FIG. 8). Therefore, while moving toward a station or the like, the image associated with that station can be displayed.
[0073]
Furthermore, in response to the approach to a station, display control is performed so that the image associated with the approaching station gradually becomes larger, and in association with the departure from a station, display control is performed so that the image associated with the departed station gradually becomes smaller (see FIG. 8B). Therefore, the associated images can be displayed in a more interesting manner.
[0074]
At this time, time data indicating the scheduled arrival time at each station or the scheduled departure time from each station may be acquired from the timetable database TDB recorded in the server device SV, and compared with the current time obtained from a clock (not shown) to detect the approach to a station or the departure from a station.
In this case, it is possible to accurately detect the departure or approach based on the time of day and to perform the necessary reproduction control of the music.
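A sketch of this timetable-based detection follows, under the assumption that an approach is declared within a short margin before the scheduled arrival time and a departure once the scheduled departure time has passed; the 30-second margin and the function names are hypothetical.

```python
from datetime import datetime, timedelta


def near_scheduled_arrival(now: datetime, scheduled_arrival: datetime,
                           margin: timedelta = timedelta(seconds=30)) -> bool:
    """Approach detection: true within `margin` before the scheduled
    arrival time taken from the timetable database TDB."""
    return scheduled_arrival - margin <= now <= scheduled_arrival


def has_departed(now: datetime, scheduled_departure: datetime) -> bool:
    """Departure detection: true once the scheduled departure time
    has passed."""
    return now > scheduled_departure
```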
[0075]
In addition, the approach to a station or the departure from a station may be detected by receiving identification data transmitted from the train or the like on which the user carrying the reproduction apparatus ST is riding, or from the premises of the station. In this case, it is possible to accurately detect the approach or departure based on the identification data acquired from the train or the like and to perform the necessary reproduction control of the music.
[0076]
Furthermore, the approach to a station or the departure from a station may be detected by detecting, with the sensor 13, the acceleration applied to the reproduction device ST, that is, the deceleration accompanying the approach to a station or the acceleration accompanying the departure from a station.
In this case, it is possible to accurately detect the departure from or approach to the station based on the acceleration applied to the playback device ST and to perform the necessary reproduction control of the music.
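This acceleration-based detection can be sketched as follows; the sign convention (negative values for deceleration along the direction of travel) and the 0.3 m/s² noise threshold are assumptions, as the text only states that deceleration indicates an approach and acceleration a departure.

```python
def classify_motion(longitudinal_accel_mps2: float,
                    threshold_mps2: float = 0.3) -> str:
    """Interpret the longitudinal acceleration reported by the sensor 13:
    sustained deceleration suggests approach to a station, sustained
    acceleration suggests departure from a station."""
    if longitudinal_accel_mps2 <= -threshold_mps2:
        return "approaching"
    if longitudinal_accel_mps2 >= threshold_mps2:
        return "departing"
    return "cruising"
```

In practice the raw sensor signal would need smoothing over several samples before such a classification is reliable.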
[0077]
Furthermore, the route data or the direction data indicating the movement may be acquired from the train or the like on which the user carrying the reproduction apparatus ST is riding, or from a station. In this case, even while the playback device ST is moving together with the train or the like, it is possible to accurately obtain the route data and the direction data and to control the reproduction of the music.
[0078]
Further, the current position of the reproducing apparatus ST detected by the sensor 13 may be recorded in the recording unit 11 going back into the past to generate a movement locus of the reproducing apparatus ST, and, based on this locus, the route name and the moving direction of the train or the like on which the user carrying the reproducing apparatus ST is currently riding may be detected. In this case, it is possible to perform reproduction control of the music suited to the movement by using the locus of the movement of the reproduction device ST.
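One way to derive the moving direction from such a recorded locus is to project its first and last points onto the ordered list of stations on a candidate route; the nearest-station projection below is an assumption, since the text does not state how the locus is matched to the route.

```python
def infer_direction(trajectory, station_positions):
    """Estimate the travel direction from a recorded movement locus.

    trajectory: chronological list of (x, y) positions of the device.
    station_positions: (x, y) positions of the stations, ordered in the
    'up' direction of the route.
    Returns 'up', 'down', or 'unknown'.
    """
    def nearest_index(point):
        # Index of the station closest to the given position.
        return min(range(len(station_positions)),
                   key=lambda i: (station_positions[i][0] - point[0]) ** 2
                                 + (station_positions[i][1] - point[1]) ** 2)

    start = nearest_index(trajectory[0])
    end = nearest_index(trajectory[-1])
    if end > start:
        return "up"
    if end < start:
        return "down"
    return "unknown"
```

A locus that stays near a single station yields "unknown", which matches the case of a train stopped at a station (step S2; NO).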
[0079]
[Modifications] Next, modifications according to the embodiment will be described. In each of the modifications described below, components identical to those of the reproduction device ST and the server device SV according to the embodiment are given the same reference numerals, and detailed description thereof is omitted.
[0080]
(1) First Modified Example First, a first modified example will be described with reference to FIG. 9. FIG. 9 is a block diagram showing a schematic configuration of a reproduction apparatus according to the first modification.
[0081]
In the embodiment described above, the association table T, the route database RDB, the timetable database TDB, the music database MDB, and the music related database VDB are recorded in the recording unit 22 of the server apparatus SV, and the data recorded in them is acquired via the network NW. On the other hand, in the first modification, as shown in FIG. 9, the association table T and the like are recorded in a non-volatile manner in a recording unit 11A comprising an HDD, SSD, or the like of the reproducing apparatus SST, and the reproduction process according to the embodiment is performed using these data. With the configuration of the reproducing apparatus SST according to the first modification, the recording capacity required of the recording unit 11A is larger than that of the recording unit 11 of the reproducing apparatus ST according to the embodiment, but even in an environment without the network NW and the server apparatus SV, the same reproduction process as the reproduction process according to the embodiment can be performed, and the same operation and effect can be obtained.
[0082]
(2) Second Modified Example Next, as a second modified example: in the above-described embodiment, the reproduction volume of the music reproduced up to that point is reduced each time the train approaches each station. On the other hand, in the second modification, the user carrying the playback device ST is made to designate the station at which he or she will get off, and the playback volume of the music being played may be reduced only when the designated getting-off station is approached. According to the configuration of the second modification, the user can clearly recognize the distinction between the getting-off station and the other stations, and the approach to the getting-off station can also be clearly recognized.
[0083]
(3) Third Modified Example Next, as a third modified example: the above-described second modified example is configured to reduce the reproduction volume of the music being reproduced only when the getting-off station is approached. In relation to this, in the third modification, the playback volume of the music being played may be reduced at a preset timing at which an in-car announcement is made when the getting-off station is approached. According to the configuration of the third modification, it is possible to prevent the in-car announcement indicating the getting-off station from being missed.
[0084]
(4) Fourth Modified Example Next, a fourth modified example will be described. In the above-described embodiment, the reproduction volume of the music being played is reduced each time the device approaches a station. In the fourth modified example, by contrast, the user carrying the playback device ST specifies a getting-off station on the device, and the reproduction volume of the music being played is increased only when the device approaches that specified getting-off station. According to the configuration of the fourth modified example, it is possible to prevent the user of the playback device ST from riding past the getting-off station.
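The fourth modified example inverts the adjustment of the second one: the volume is raised, not lowered, near the designated stop so that the louder playback acts as an alert. A minimal sketch, with all names and values assumed for illustration:

```python
# Sketch of the fourth modified example: raise (rather than lower) the
# playback volume when approaching the user-designated getting-off
# station, so the user does not ride past the stop. Illustrative only.

NORMAL_VOLUME = 1.0  # assumed full playback volume
RAISED_VOLUME = 1.5  # assumed boost factor near the designated stop

def alert_volume(approaching_station: str, getting_off_station: str) -> float:
    """Return the playback volume to use when approaching a station."""
    if approaching_station == getting_off_station:
        return RAISED_VOLUME  # louder playback works as a wake-up alert
    return NORMAL_VOLUME      # other stations: playback is unchanged

print(alert_volume("C", "C"))  # raised near the designated stop
print(alert_volume("A", "C"))  # unchanged at other stations
```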
[0085]
(5) Fifth Modified Example Finally, a fifth modified example will be described. In the above-described embodiment, the reproduction volume of the music being played is reduced each time the device approaches a station. In the fifth modified example, instead, the reproduction volume of the music being played is reduced at a preset broadcast timing at which an in-car announcement is made in the train or the like in which the user carrying the playback device ST is riding. The broadcast timing in this case may be detected as a preset time before arrival at the next station, or may be detected by receiving identification data transmitted into the cars from the train or the like. According to the configuration of the fifth modified example, it is possible to prevent an in-car announcement in the train or the like from being missed.
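The two detection methods described for the broadcast timing can be sketched as follows. This is an illustration under stated assumptions: the function name, the 60-second lead time, and the arrival-time estimate are all assumed for the sketch and do not appear in the specification.

```python
# Sketch of the fifth modified example: decide whether the in-car
# announcement timing has been reached, either (a) once within a preset
# lead time before estimated arrival at the next station, or (b) on
# receipt of identification data broadcast inside the cars. Illustrative.

ANNOUNCEMENT_LEAD_SECONDS = 60.0  # assumed preset lead time

def broadcast_timing_reached(now: float,
                             next_arrival_time: float,
                             received_identification_data: bool) -> bool:
    """Return True when playback volume should be reduced for the announcement."""
    # (b) identification data transmitted into the cars takes precedence
    if received_identification_data:
        return True
    # (a) otherwise, trigger once within the preset lead time before arrival
    return next_arrival_time - now <= ANNOUNCEMENT_LEAD_SECONDS

print(broadcast_timing_reached(0.0, 300.0, False))  # False: too early
print(broadcast_timing_reached(0.0, 45.0, False))   # True: within lead time
print(broadcast_timing_reached(0.0, 300.0, True))   # True: data received
```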
[0086]
(6) Others In the above-described embodiment and modified examples, the present invention is applied to the case where the user carrying the reproduction apparatus ST rides a train or the like. In addition, however, the present invention may also be applied, for example, to the reproduction processing of music in a reproduction device ST carried by a user riding in a vehicle traveling on an expressway. In this case, the station according to the embodiment and each modified example corresponds to a service area, a parking area, or an interchange, and the distance between stations according to the embodiment and each modified example corresponds to, for example, the section between a service area and a parking area.
[0087]
Furthermore, in the embodiment and each modified example described above, the image displayed on the reproduction device ST is an image related to the music being reproduced on the reproduction device ST; however, an image associated with an article or the like may instead be displayed in correspondence with the station or with the movement between stations.
[0088]
Furthermore, a program corresponding to the flowchart shown in FIG. 6 may be recorded on a recording medium such as an optical disc or a hard disk, or obtained via a network such as the Internet, and read into a general-purpose microcomputer or the like. By executing the program, the microcomputer or the like can be caused to function as the processing unit 10 of the playback device ST according to the embodiment.
[0089]
DESCRIPTION OF SYMBOLS 1 Acquisition means (internal interface), 2 Output means, 21 Speaker, 22 Display, 3 Control means, 31 Reproduction control part, 32 Display control part, 100, 102, 104 Music image, 101, 103, 105 Music singer display, 200, 210 Route display, 201 Station display, S Output device, ST, ST1, ST2, STn, SST Reproduction device, SS Reproduction system, T Association table, RDB Route database, MDB Music database, VDB Music-related database, PM Current position mark, YM User mark