DESCRIPTION JP2014175932
Patent Translate
Powered by EPO and Google
Notice
This translation is machine-generated. It cannot be guaranteed that it is intelligible, accurate,
complete, reliable or fit for specific purposes. Critical decisions, such as commercially relevant or
financial decisions, should not be based on machine-translation output.
Abstract: An approach of an obstacle can be detected, and a user can be guided in a direction to
retract from the obstacle. A direction estimation unit estimates a sound source direction based on
acoustic signals of a plurality of channels, a guidance direction determination unit determines a
guidance direction based on the sound source direction estimated by the direction estimation unit,
and a direction presentation unit presents the guidance direction determined by the guidance
direction determination unit. The sound collection units that record the acoustic signals may be
arranged at different positions, and the direction estimation unit may estimate the sound source
direction based on, for example, the intensity ratio of the acoustic signals between the channels.
[Selected figure] Figure 2
Electronic Device
[0001]
The present invention relates to an electronic device.
[0002]
Some electronic devices have a function of providing various information to the user, for
example, a function of reproducing contents such as music, voice, and video, a call function, an
electronic mail function, and the like.
Among these electronic devices, for example, there are electronic devices such as smartphones
and other multifunctional mobile phones, tablet terminals, etc., which are configured to be small
and lightweight, and are widely used. When the user walks and uses these functions, safety may
not be secured because they do not notice surrounding people or objects. Therefore, it has been
proposed to alert the user to surrounding people and objects.
[0003]
The alarm notification method described in Patent Document 1 detects the approach of a person,
an object, or the like emitting sound waves in the voice band in an environment in which the user
uses earphones with a mobile phone that block external sound, and lowers the volume of the
sound the user is listening to through the earphones. The method then plays an alarm
notification voice such as "A car-like object is approaching."
[0004]
JP 2010-128789 A
[0005]
However, in the alarm notification method described in Patent Document 1, although the alarm
notification voice is made clearly audible by relatively increasing its volume, it is not sufficient
to make the user recognize the direction from which an obstacle such as a person or an object
approaches.
In addition, when the user operates the device while wearing a device that covers the ears, such
as earphones or headphones, the user may not be able to perceive the direction of the sound
emitted by the obstacle. Therefore, the user cannot determine the direction in which to retract
from the obstacle, and there is a possibility that an accident of contacting or colliding with the
obstacle may occur.
[0006]
The present invention has been made in view of the above points, and provides an electronic
device capable of guiding in a direction of retracting from an obstacle.
[0007]
The present invention has been made to solve the above problems, and one aspect of the present
invention is an electronic device comprising: a direction estimation unit that estimates a sound
source direction based on acoustic signals of a plurality of channels; a guidance direction
determination unit that determines a guidance direction based on the sound source direction
estimated by the direction estimation unit; and a direction presentation unit that presents the
guidance direction determined by the guidance direction determination unit.
[0008]
According to the present invention, it is possible to detect the approach of an obstacle and to
guide the user in a direction of retracting from the obstacle.
[0009]
FIG. 1 is a plan view of an electronic device according to a first embodiment of the present
invention.
FIG. 2 is a schematic block diagram showing the configuration of the electronic device according
to the embodiment.
FIG. 3 is a flowchart showing the process of estimating the sound source direction according to
the embodiment.
FIG. 4 is a diagram showing an example of relative direction information.
FIG. 5 is a diagram showing an arrangement example of the sound collection units according to
the embodiment.
FIG. 6 is a flowchart showing the process of determining the guidance direction according to the
embodiment.
FIG. 7 is a conceptual diagram showing an example of a sound source direction, a displacement
angle, and a guidance direction.
FIG. 8 is a conceptual diagram showing an example of guidance information.
FIG. 9 is a diagram showing an example of the timing at which the light emitting units emit light.
FIG. 10 is a flowchart showing the information processing according to the embodiment.
FIG. 11 is a plan view of an electronic device according to a second embodiment of the present
invention.
FIG. 12 is a schematic block diagram showing the configuration of the electronic device
according to the second embodiment.
FIG. 13 is a diagram showing the relationship of the azimuth before and after correction.
FIG. 14 is a diagram showing an arrangement example of the sound collection units according to
the second embodiment.
[0010]
(First Embodiment) Hereinafter, a first embodiment of the present invention will be described
with reference to the drawings. FIG. 1 is a plan view of an electronic device 1 according to the
present embodiment. In FIG. 1, the right side is the X direction (horizontal direction), and the
lower side is the Y direction (vertical direction). The X direction and the Y direction are the same
in FIGS. 5, 8, 11, and 14 described later.
[0011]
The electronic device 1 is, for example, a multifunctional mobile phone (including a so-called
smartphone), a tablet terminal device, a personal computer, or the like. In the following, a
multifunctional mobile phone is taken as an example of the electronic device 1. The electronic
device 1 includes a housing 101, a display unit 102, N sound collection units 103-1 to 103-N
(N is an integer equal to or greater than 3; N = 4 in the example shown in FIG. 1), M1 light
emitting units 104-1 to 104-M1 (M1 is an integer equal to or greater than 2; M1 = 2 in the
example shown in FIG. 1), and M2 vibration units 105-1 to 105-M2 (M2 is an integer equal to
or greater than 2; M2 = 2 in the example shown in FIG. 1).
[0012]
In the following description, the N sound collection units 103-1 to 103-N may be simply
referred to as the sound collection unit 103 when they need not be distinguished from one
another. Likewise, when the M1 light emitting units 104-1 to 104-M1 and the M2 vibration
units 105-1 to 105-M2 need not be distinguished from one another, they may be referred to as
the light emitting unit 104 and the vibration unit 105, respectively.
[0013]
(Configuration of Electronic Device 1) Next, the configuration of the electronic device 1 according
to the present embodiment will be described. FIG. 2 is a schematic block diagram showing the
configuration of the electronic device 1 according to the present embodiment. The electronic
device 1 includes a display unit 102, a sound collection unit 103, a light emitting unit 104, a
vibration unit 105, an acceleration sensor 106, and a control unit 110.
[0014]
The display unit 102 covers most of the surface of the electronic device 1 and displays an image
based on the image signal input from the control unit 110. The display unit 102 is, for example, a
liquid crystal display panel, an organic EL (Electroluminescence) display panel, or the like. The
display unit 102 may be a touch panel integrated with a touch sensor that detects a position at
which an operation object such as a user's finger contacts.
[0015]
Each of the sound collection units 103-1 to 103-N converts the vibration due to an arriving
sound wave into an acoustic signal, which is an electric signal, and outputs the converted
acoustic signal to the direction estimation unit 111. The N sound collection units 103-1 to
103-N, and likewise the acoustic signals they convert, are distinguished by channel. The sound
collection unit 103 is, for example, a unidirectional microphone having one sensitivity axis, that
is, the direction in which its sensitivity to sound waves is highest. An arrangement example of
the sound collection units 103-1 to 103-N will be described later.
[0016]
The light emitting unit 104 emits light while power is supplied from the control unit 110. The
light emitting unit 104 is, for example, a light emitting diode. In the example illustrated in FIG.
1, the light emitting units 104-1 and 104-2 are disposed on the surface directly above the upper
left end and the upper right end of the display unit 102, respectively. These positions are rarely
covered by the user's hand when the user grips the electronic device 1.
[0017]
The vibration unit 105 vibrates at a predetermined frequency (for example, 100 Hz) while power
is supplied from the control unit 110. The vibration unit 105 is, for example, a vibrator or an
actuator. In the example illustrated in FIG. 1, the vibration units 105-1 and 105-2 are disposed
on the side surfaces at the lower left end and the lower right end of the display unit 102,
respectively. These positions are often touched by the user's hand when the user holds the
electronic device 1.
[0018]
The acceleration sensor 106 detects the acceleration applied to it, and outputs an acceleration
signal indicating the detected acceleration to the control unit 110. The acceleration sensor 106
is, for example, a triaxial MEMS (Micro-Electro-Mechanical Systems) sensor having three
mutually orthogonal sensitivity axes. The acceleration sensor 106 is disposed such that one of
the three sensitivity axes is oriented in the Z direction, orthogonal to the X direction and the
Y direction.
[0019]
The control unit 110 controls the operation of the electronic device 1. The control unit 110
includes, for example, a central processing unit (CPU) and a counter, and can realize various
functions by executing programs stored in a storage medium such as a read-only memory
(ROM). In terms of these functions, the control unit 110 includes a direction estimation unit
111, a guidance direction determination unit 112, a guidance information generation unit 113,
and a tilt detection unit 114.
[0020]
The direction estimation unit 111 estimates the sound source direction at predetermined time
intervals (for example, one second) based on the N-channel acoustic signals input from the
sound collection units 103-1 to 103-N. The estimated sound source direction indicates the
direction of a sound-generating obstacle, such as a vehicle or a person. The direction estimation
unit 111 outputs sound source direction information indicating the estimated sound source
direction to the guidance direction determination unit 112. The process in which the direction
estimation unit 111 estimates the sound source direction will be described later.
[0021]
The guidance direction determination unit 112 determines the guidance direction in which to
guide the user based on the sound source direction indicated by the sound source direction
information input from the direction estimation unit 111. The determined guidance direction is,
for example, a retraction direction that guides the user away from an obstacle relatively
approaching the user.
The guidance direction determination unit 112 outputs guidance direction information indicating
the determined guidance direction to the guidance information generation unit 113. The process
in which the guidance direction determination unit 112 determines the guidance direction will be
described later.
[0022]
The guidance information generation unit 113 generates guidance information for presenting the
guidance direction indicated by the guidance direction information input from the guidance
direction determination unit 112, using any one of the display unit 102, the light emitting unit
104, and the vibration unit 105, or any combination thereof. The guidance information
generation unit 113 outputs an image signal indicating the generated guidance information to
the display unit 102, and supplies power representing the guidance information to the light
emitting unit 104 or the vibration unit 105. Accordingly, the guidance information generation
unit 113, together with any one of the display unit 102, the light emitting unit 104, and the
vibration unit 105, or any combination thereof, forms a direction presentation unit that presents
the guidance direction. An example of the guidance information will be described later.
[0023]
The inclination detection unit 114 samples the acceleration signal input from the acceleration
sensor 106 at predetermined time intervals (for example, 0.1 seconds), and detects the
inclination of the electronic device 1 from the horizontal plane based on the detected
acceleration signal. For example, when the acceleration in the Z direction indicated by the
acceleration signal is smaller than a predetermined threshold (for example, 0.7 g, where g is the
gravitational acceleration), the inclination detection unit 114 determines that the electronic
device 1 is inclined from the horizontal plane. In that case, the inclination detection unit 114
generates an inclination detection signal indicating that the inclination has been detected, and
outputs the generated inclination detection signal to any one of the direction estimation unit
111, the guidance direction determination unit 112, and the guidance information generation
unit 113, or any combination thereof.
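The threshold test in paragraph [0023] can be sketched as follows. This is an illustrative sketch only, not the patent's implementation; the function and variable names are hypothetical, and only the 0.7 g threshold comes from the text.

```python
# Sketch of the tilt detection of paragraph [0023]: the device is judged
# "inclined" when the Z-axis acceleration falls below 0.7 g. Names are
# illustrative, not from the patent.

G = 9.80665                 # gravitational acceleration [m/s^2]
TILT_THRESHOLD = 0.7 * G    # threshold from paragraph [0023]

def is_inclined(accel_z: float) -> bool:
    """Return True when the device is inclined from the horizontal plane.

    accel_z: acceleration along the Z axis (perpendicular to the device
    face) in m/s^2. When the device lies flat, accel_z is close to g;
    as the device tilts, the Z component shrinks.
    """
    return accel_z < TILT_THRESHOLD

# Example: device lying almost flat vs. tilted well past the threshold.
print(is_inclined(9.7))      # flat, above threshold -> False
print(is_inclined(G * 0.5))  # strongly tilted -> True
```

When `is_inclined` returns True, the control flow of paragraph [0024] (suppressing estimation or warning the user) would apply.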
[0024]
When the inclination detection signal is input, the direction estimation unit 111, the guidance
direction determination unit 112, and the guidance information generation unit 113 may stop
estimating the sound source direction, determining the guidance direction, and generating the
guidance information, respectively. This prevents a sound source direction with inferior
estimation accuracy from being used in the process of determining the guidance direction
described later, and avoids presenting a wrong guidance direction to the user. In addition, when
the tilt detection unit 114 generates a tilt detection signal, the control unit 110 may display on
the display unit 102 a warning image notifying the user that the accuracy of the presented
guidance direction may deteriorate, or prompting the user to hold the electronic device 1
parallel to the horizontal plane.
[0025]
Next, the process in which the direction estimation unit 111 estimates the sound source
direction will be described. FIG. 3 is a flowchart showing the process of estimating the sound
source direction according to the present embodiment. (Step S101) The direction estimation
unit 111 calculates the intensity (for example, power, mean absolute amplitude, etc.) at
predetermined time intervals (for example, 50 ms) for each of the acoustic signals input from
the sound collection units 103-1 to 103-4. Thereafter, the process proceeds to step S102.
(Step S102) The direction estimation unit 111 selects the channel Ma relating to the sound
collection unit 103 with the highest calculated intensity. Thereafter, the process proceeds to
step S103. (Step S103) The direction estimation unit 111 selects the channel Mb relating to the
sound collection unit 103 with the next highest calculated intensity. However, the channel
relating to the sound collection unit (for example, the sound collection unit 103-2) facing the
sound collection unit (for example, the sound collection unit 103-4) relating to the channel Ma
is excluded from the candidates for the channel Mb. Therefore, in the example shown in FIG. 1,
one of the channels corresponding to the two sound collection units adjacent to the sound
collection unit relating to the channel Ma is selected as the channel Mb. Thereafter, the process
proceeds to step S104. (Step S104) The direction estimation unit 111 calculates the intensity
ratio (inter-channel intensity ratio) of the intensity of the channel Mb to the intensity of the
channel Ma. Thereafter, the process proceeds to step S105.
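Steps S101 to S104 can be sketched as follows, assuming the four-microphone layout of FIG. 5 (channel 1: up, 2: left, 3: down, 4: right). This is a minimal illustrative sketch; the helper names and the choice of mean power as the intensity measure are assumptions, not prescribed by the patent.

```python
# Sketch of steps S101-S104 (FIG. 3): per-channel intensity, selection of
# the loudest channel Ma, the next-loudest non-facing channel Mb, and the
# inter-channel intensity ratio. Names are hypothetical.

def frame_power(samples):
    """Mean power of one frame (e.g. 50 ms) of an acoustic signal (S101)."""
    return sum(s * s for s in samples) / len(samples)

# Facing microphone pairs in the FIG. 5 layout (up<->down, left<->right).
OPPOSITE = {1: 3, 2: 4, 3: 1, 4: 2}

def select_channels(frames):
    """frames: dict mapping channel number -> list of samples.
    Returns (Ma, Mb, inter-channel intensity ratio)."""
    power = {ch: frame_power(s) for ch, s in frames.items()}
    ma = max(power, key=power.get)                     # step S102
    candidates = [ch for ch in power
                  if ch not in (ma, OPPOSITE[ma])]     # step S103: exclude
    mb = max(candidates, key=power.get)                # facing channel
    ratio = power[mb] / power[ma]                      # step S104
    return ma, mb, ratio

# Example: channel 4 loudest, channel 3 next; channel 2 faces 4, so it is
# never a candidate for Mb.
frames = {1: [0.1, 0.1], 2: [0.05, 0.05], 3: [0.5, 0.5], 4: [1.0, 1.0]}
print(select_channels(frames))   # (4, 3, 0.25)
```

The returned ratio would then be mapped to a relative direction in step S105.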
[0026]
(Step S105) The direction estimation unit 111 determines the relative direction corresponding
to the calculated inter-channel intensity ratio. The relative direction is a sound source direction
referenced to the direction of the sound collection unit 103 relating to the channel Ma, where
the direction of rotation from that direction toward the sound collection unit 103 relating to the
channel Mb is taken as positive. The direction estimation unit 111 stores, for example, relative
direction information indicating the relationship between the inter-channel intensity ratio and
the relative direction in advance in a storage unit provided in the unit itself. An example of
relative direction information will be described later. If the stored relative direction information
contains a relative direction corresponding to the calculated inter-channel intensity ratio, the
direction estimation unit 111 reads that relative direction. If not, the direction estimation unit
111 reads the relative directions corresponding to the stored intensity ratios adjacent to the
calculated inter-channel intensity ratio, and interpolates or extrapolates between them to
calculate the relative direction corresponding to the calculated ratio. Thereafter, the process
proceeds to step S106.
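The table lookup with interpolation in step S105 can be sketched as follows. The intermediate table rows here are made-up placeholders; the patent only fixes the endpoints (ratio 0.002 maps to 0 degrees and ratio 1 to 45 degrees, see paragraph [0030] below), and the linear interpolation is one possible reading of "interpolation".

```python
# Sketch of step S105: map the inter-channel intensity ratio to a relative
# direction by table lookup with linear interpolation between adjacent rows.
# Intermediate rows are hypothetical; only the endpoints follow the text.

RATIOS  = [0.002, 0.1, 0.3, 0.6, 1.0]    # stored intensity ratios (ascending)
DEGREES = [0.0, 12.0, 24.0, 36.0, 45.0]  # corresponding relative directions

def relative_direction(ratio):
    # Clamp to the table range instead of extrapolating beyond it.
    ratio = min(max(ratio, RATIOS[0]), RATIOS[-1])
    for i in range(1, len(RATIOS)):
        if ratio <= RATIOS[i]:
            # Linear interpolation between the two adjacent table rows.
            t = (ratio - RATIOS[i - 1]) / (RATIOS[i] - RATIOS[i - 1])
            return DEGREES[i - 1] + t * (DEGREES[i] - DEGREES[i - 1])

print(relative_direction(1.0))    # 45.0 (sound source between the two mics)
print(relative_direction(0.002))  # 0.0  (sound source on Ma's axis)
```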
[0027]
(Step S106) The direction estimation unit 111 estimates, as the sound source direction, the
absolute direction referenced to the channel Ma for the determined relative direction. Here, the
direction estimation unit 111 calculates the absolute direction by adding the relative direction to
the direction of the sound collection unit relating to the channel Ma, with a polarity
corresponding to the rotation direction from that direction toward the direction of the sound
collection unit 103 relating to the channel Mb. Thereafter, the process of estimating the sound
source direction shown in FIG. 3 ends.
[0028]
The process of calculating the absolute direction based on the relative direction in step S106
will be described later together with an arrangement example of the sound collection unit 103.
In the present embodiment, the process by which the direction estimation unit 111 estimates the
sound source direction is not limited to that shown in FIG. 3; the sound source direction may
instead be estimated based on the difference in arrival time of sound waves between the sound
collection units 103-1 to 103-N. In that case, as long as the sound collection units 103-1 to
103-N are not distributed on the same straight line, they may be arranged at any positions on
the surface of the electronic device 1 (FIG. 1).
[0029]
Next, an example of relative direction information will be described. FIG. 4 is a diagram
showing an example of relative direction information. The relative direction information shown
in FIG. 4 gives the inter-channel intensity ratio measured, with the electronic device 1 (FIG. 1)
installed on a horizontal plane, between the channel Ma of one of the sound collection units 103
(for example, the sound collection unit 103-4) and the channel Mb of another (for example, the
sound collection unit 103-3). The relative direction is the direction of the sound source when
the direction of the one sound collection unit 103 is 0 degrees and the direction of the other
sound collection unit 103 is 90 degrees.
[0030]
In FIG. 4, the left column shows the inter-channel intensity ratio, and the right column shows
the relative direction. In this example, the relative direction increases as the inter-channel
intensity ratio increases. The relative direction takes values from 0 degrees to 45 degrees, and
the inter-channel intensity ratio takes values from a minimum of 0.002 to a maximum of 1. A
relative direction of 0 degrees indicates that the sound source is in the direction of one specific
sound collection unit 103. In that case, that sound collection unit 103 acquires the acoustic
signal from the sound source almost exclusively, and the other sound collection unit 103 hardly
acquires it, so the inter-channel intensity ratio is at its minimum. On the other hand, a relative
direction of 45 degrees indicates that the sound source lies in the same direction from the two
adjacent sound collection units 103. In that case, the intensities of the acoustic signals the two
sound collection units 103 acquire from the sound source become substantially equal, so the
inter-channel intensity ratio takes its maximum value of 1.
[0031]
Next, an arrangement example of the sound collection units 103-1 to 103-4 will be described.
FIG. 5 is a diagram showing an arrangement example of the sound collection units 103-1 to
103-4 according to the present embodiment. The arrangement of the sound collection units
103-1 to 103-4 shown in FIG. 5 is the same as that shown in FIG. 1, but the light emitting units
104-1 and 104-2 and the vibration units 105-1 and 105-2 are omitted from the illustration. The
sound collection units 103-1, 103-2, 103-3, and 103-4 are disposed on the same plane on the
upper side, the left side, the lower side, and the right side of the electronic device 1,
respectively, and their sensitivity axes are directed perpendicular to the sides on which they are
arranged, that is, upward, leftward, downward, and rightward. In FIG. 5, the sensitivity axes are
indicated by dashed-dotted lines, and their intersection is the origin O. In this example, the
sound collection units 103-2 and 103-4 are disposed above the midpoint in the Y direction,
where they are unlikely to be covered by the hands of the user holding the electronic device 1.
Therefore, when the user grips the electronic device 1 by its short sides from the left and right
(portrait holding), the process of estimating the sound source direction is not disturbed.
[0032]
In FIG. 5, θ represents an example of the relative direction determined in step S105 (FIG. 3)
when the channel relating to the sound collection unit 103-4 is taken as the channel Ma and the
channel relating to the sound collection unit 103-3 as the channel Mb, and α represents an
example of the sound source direction (absolute direction) estimated in step S106 (FIG. 3) for
that relative direction θ. That is, with the upper side as the reference (0 degrees), the sound
source direction is calculated by adding the relative direction θ, with the polarity (negative (−1)
for clockwise) corresponding to the rotation direction (clockwise) from the direction (270°
counterclockwise) of the sound collection unit 103-4 relating to the channel Ma toward the
direction (180°) of the sound collection unit 103-3 relating to the channel Mb. The polarity
corresponding to the counterclockwise direction is positive (+1).
[0033]
Next, the process in which the guidance direction determination unit 112 determines the
guidance direction will be described. FIG. 6 is a flowchart showing the process of determining
the guidance direction according to the present embodiment. (Step S201) The guidance
direction determination unit 112 stores the sound source direction information input from the
direction estimation unit 111 at the current time t2 in a storage unit provided in the unit itself.
The sound source direction indicated by the input sound source direction information is
represented by β. The guidance direction determination unit 112 reads, from the storage unit,
the sound source direction information input at the previous time t1. The sound source
direction indicated by the read sound source direction information is represented by α. The
storage unit stores at least two sound source directions. Thereafter, the process proceeds to step
S202. (Step S202) The guidance direction determination unit 112 calculates the displacement
angle δ, which is the difference between the sound source direction β and the sound source
direction α. The displacement angle δ indicates the change in the sound source direction from
the previous time t1 to the current time t2. Thereafter, the process proceeds to step S203. (Step
S203) The guidance direction determination unit 112 determines whether the displacement
angle δ is equal to or larger than 0 degrees. If so (YES in step S203), the process proceeds to
step S205. If the angle is smaller than 0 degrees (NO in step S203), the process proceeds to
step S204. (Step S204) The guidance direction determination unit 112 adds 360 degrees to the
displacement angle δ, and then proceeds to step S205. Steps S203 and S204 bring the
displacement angle δ into the range between 0 degrees and 360 degrees.
[0034]
(Step S205) The guidance direction determination unit 112 determines whether the
displacement angle δ is equal to or smaller than 180 degrees. If so (YES in step S205), the
process proceeds to step S206. If the angle is larger than 180 degrees (NO in step S205), the
process proceeds to step S207. (Step S206) The guidance direction determination unit 112
calculates the guidance direction γ by adding a predetermined angle ψ (for example, 90
degrees) to the sound source direction α at the previous time t1. Thereafter, the process
according to FIG. 6 ends. (Step S207) The guidance direction determination unit 112 calculates
the guidance direction γ by subtracting the predetermined angle ψ from the sound source
direction α at the previous time t1. Thereafter, the process according to FIG. 6 ends.
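The flow of FIG. 6 (steps S201 to S207) can be sketched as follows. The function name is illustrative; alpha and beta are the sound source directions at the previous time t1 and the current time t2, and psi is the predetermined angle (90 degrees in the text's example).

```python
# Sketch of the guidance-direction determination of FIG. 6. Angles are in
# degrees, measured counterclockwise from the user's front.

def guidance_direction(alpha, beta, psi=90.0):
    delta = beta - alpha              # step S202: displacement angle
    if delta < 0.0:                   # steps S203-S204: wrap into [0, 360)
        delta += 360.0
    if delta <= 180.0:                # step S205: counterclockwise change?
        return (alpha + psi) % 360.0  # step S206: gamma = alpha + psi
    return (alpha - psi) % 360.0      # step S207: gamma = alpha - psi

# Sound source rotating counterclockwise around the user: retract to
# alpha + 90 degrees.
print(guidance_direction(30.0, 80.0))    # 120.0
# Rotating clockwise (delta wraps above 180): retract to alpha - 90 degrees.
print(guidance_direction(30.0, 340.0))   # 300.0
```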
[0035]
Next, examples of the sound source directions α and β, the displacement angle δ, and the
guidance direction γ will be described. FIG. 7 is a conceptual diagram showing an example of
the sound source directions α and β, the displacement angle δ, and the guidance direction γ.
The upper part of FIG. 7 corresponds to the front of a certain user, and in the present
embodiment the sound source directions α and β and the guidance direction γ are calculated
with reference to this direction. The circles labeled U1 and U2 indicate the positions where the
user is located at the previous time t1 and the current time t2. The thick arrow starting at the
upper right of the circle U1 and ending at the lower right of the circle U2 indicates that the user
has moved as a pedestrian between the previous time t1 and the current time t2. The circles
labeled C1 and C2 indicate the positions where a certain vehicle is located at the previous time
t1 and the current time t2. The thick arrow starting directly above the circle C1 and ending
directly below the circle C2 indicates that the vehicle, an obstacle serving as a sound source,
has traveled between the previous time t1 and the current time t2. Accordingly, the sound
source direction α indicates the direction of the vehicle C1 from the user U1, and the sound
source direction β indicates the direction of the vehicle C2 from the user U2. In this example,
since the displacement angle δ is larger than 0 degrees and smaller than 180 degrees, the
guidance direction γ is calculated by adding the predetermined angle ψ to the sound source
direction α.
[0036]
In steps S206 and S207 (FIG. 6), the guidance direction γ is calculated based on the sound
source direction α because, in an environment where pedestrians pass along long, narrow
routes such as roads, the sound source direction α is closer than the sound source direction β
to the moving direction of an obstacle (sound source) such as a vehicle. Hereinafter, this
environment is called a traffic environment. In a traffic environment, obstacles customarily
approach relatively from behind or in front of a pedestrian (user). In the present embodiment,
by moving in the guidance direction γ, separated from the sound source direction α by the
predetermined angle ψ (for example, 90 degrees), the user reduces the risk of contacting or
colliding with an obstacle moving in a direction approximating the sound source direction α. In
the traffic environment, the angle between the traveling direction of the obstacle and the
guidance direction γ is approximately 90 degrees.
[0037]
When the displacement angle δ is within a predetermined range from 0 degrees (for example,
0 to 30 degrees), it indicates that the obstacle approaches from behind toward the front of the
pedestrian, or from in front toward the rear of the pedestrian. When the displacement angle δ
is within a predetermined range from 360 degrees (for example, 330 to 360 degrees), it
indicates that the obstacle approaches from the right rear toward the front of the pedestrian, or
from the left front toward the rear of the pedestrian. Step S205 (FIG. 6) is a process of
discriminating between the two: it judges whether the change in the displacement angle δ, that
is, in the sound source direction, is counterclockwise with respect to the pedestrian. By
applying step S206 or S207 (FIG. 6) according to the determination result, the calculated
guidance direction γ is obtained as a direction in which the pedestrian can effectively retract
from the obstacle.
[0038]
When the sound source direction α is within a predetermined range from 90 degrees or 270
degrees (for example, 60 to 120 degrees or 240 to 300 degrees), the predetermined angle ψ
may be set to an angle larger than 90 degrees (for example, 120 degrees). In this case, in a
traffic environment, the obstacle (sound source) is on the left or right side at a close distance
from the pedestrian (user); that is, the obstacle passes on the left or right side of the pedestrian.
In steps S206 and S207 (FIG. 6), the guidance direction γ calculated by adding (or subtracting)
an angle larger than 90 degrees as the predetermined angle ψ to (or from) the sound source
direction α is obtained as a direction in which the pedestrian can effectively retract from the
obstacle. This takes into account the processing delay from the recording of the sound wave to
the guidance of the pedestrian in the guidance direction γ, so that the angle between the
advancing direction of the obstacle and the guidance direction γ approximates 90 degrees.
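The refinement in paragraph [0038] can be sketched as a choice of the predetermined angle ψ based on the sound source direction α. The helper name is hypothetical; the 60-120 / 240-300 degree ranges and the 120 degree example come from the text.

```python
# Sketch of the psi selection of paragraph [0038]: use a larger
# predetermined angle when the sound source direction alpha indicates an
# obstacle passing close on the pedestrian's left or right side.

def predetermined_angle(alpha):
    """Return the predetermined angle psi (degrees) for a sound source
    direction alpha (degrees, counterclockwise from the user's front)."""
    alpha %= 360.0
    # Within 30 degrees of 90 deg (left side) or 270 deg (right side).
    near_side = 60.0 <= alpha <= 120.0 or 240.0 <= alpha <= 300.0
    return 120.0 if near_side else 90.0

print(predetermined_angle(100.0))  # 120.0: obstacle passing close by
print(predetermined_angle(10.0))   # 90.0:  obstacle ahead or behind
```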
[0039]
Next, an example of the guidance information will be described. FIG. 8 is a conceptual diagram
showing an example of guidance information. The arrangement of the display unit 102, the light
emitting units 104-1 and 104-2, and the vibrating units 105-1 and 105-2 shown in FIG. 8 is the
same as the arrangement shown in FIG. However, in FIG. 8, illustration of the sound collection
units 103-1 to 103-4 is omitted. The guidance information generation unit 113 may present the
guidance direction indicated by the guidance direction information input from the guidance
direction determination unit 112 as described below. (1) The display unit 102 displays a figure
representing the guidance direction. The guidance information generation unit 113 generates an
image signal related to a graphic representing the guidance direction, and outputs the generated
image signal to the display unit 102. The form of the figure is not limited to the arrow ar1 pointing in the guidance direction as shown in FIG. 8; any figure that indicates the guidance direction, for example a triangle, may be used. Thereby, the guidance direction can be presented freely over all 360 degrees.
[0040]
(2) The light emitting units 104-1 and 104-2 emit light in the order in which they are arranged in the guidance direction, or in the direction closest to the guidance direction. The guidance information generation unit 113 sequentially supplies power to each of the light emitting units 104-1 and 104-2 in that order. The timing of supplying power and emitting light will be described later. The order in the direction closest to the guidance direction is, for example, the order from the light emitting unit 104-2 to the light emitting unit 104-1 when the guidance direction is the direction indicated by the arrow ar1. The directions that can be presented are limited by the distribution and the number of the light emitting units 104. However, since no figure indicating the direction is displayed on the display unit 102, functions being used by the user, such as a call, are not disturbed.
[0041]
(3) The vibration units 105-1 and 105-2 vibrate in the order in which they are arranged in the guidance direction, or in the direction closest to the guidance direction. The guidance information generation unit 113 sequentially supplies power to each of the vibration units 105-1 and 105-2 in that order. The power supply timing is the same as in (2). Also, as in (2), the functions being used by the user are not disturbed. In addition, the presented guidance direction can be recognized by the sense of touch even when the user is not viewing the display unit 102.
[0042]
Next, the timings at which the light emitting units 104-1 and 104-2 emit light will be described. FIG. 9 is a diagram illustrating an example of the timing at which the light emitting units 104-1 and 104-2 emit light. In FIG. 9, the vertical axis represents each of the light emitting units, and the horizontal axis represents time. A thick horizontal line indicates a section in which the light emitting units 104-1 and 104-2 emit light. In this example, power is supplied sequentially, in order from the light emitting unit 104-2 at one end to the light emitting unit 104-1 at the other end, so that each unit emits light for a predetermined light emission time τ1 (for example, 0.2 seconds). After the emitting unit reaches the light emitting unit 104-1 at the other end, emission returns to the light emitting unit 104-2 at one end, and the emitting unit is again switched at time intervals of τ1. When the number M1 of the light emitting units 104-1 and 104-2 is two, a turn-off time τ2 (preferably a time longer than τ1, for example, 0.6 seconds), during which power is not supplied and the units are turned off, is provided between the time the emitting unit returns from the light emitting unit 104-1 at the other end to the light emitting unit 104-2 at one end. Thereby, the user can visually recognize the moving direction of the emitted light and distinguish it from a simple alternation of the emitting units.
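The timing of FIG. 9 can be sketched as follows; the function and its return format are illustrative only, and the unit labels are derived from the reference numerals in the text.

```python
def emission_schedule(num_cycles=2, m1=2, tau1=0.2, tau2=0.6):
    """Sketch of the FIG. 9 timing: the units light in order
    104-2 -> 104-1 for tau1 seconds each; with only M1 = 2 units,
    a turn-off time tau2 (longer than tau1) is inserted before the
    sweep restarts, so its direction stays visually recognizable.
    Returns (unit_label, start_time, duration) tuples."""
    schedule, t = [], 0.0
    units = [f"104-{m1 - i}" for i in range(m1)]  # e.g. 104-2, 104-1
    for _ in range(num_cycles):
        for unit in units:
            schedule.append((unit, t, tau1))
            t += tau1
        t += tau2  # all units dark while returning to the first unit
    return schedule
```

With the defaults (τ1 = 0.2 s, τ2 = 0.6 s), the second sweep starts 1.0 s after the first.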
[0043]
Next, information processing according to the present embodiment will be described. FIG. 10 is a flowchart showing the information processing according to the present embodiment. (Step S301) The direction estimation unit 111 receives the N-channel acoustic signals respectively recorded by the sound collection units 103-1 to 103-N. Thereafter, the process proceeds to step S302. (Step S302) An acceleration signal is input from the acceleration sensor 106 to the tilt detection unit 114. Thereafter, the process proceeds to step S303. (Step S303) The tilt detection unit 114 determines, based on the acceleration signal, whether a tilt of the electronic device 1 is detected. If a tilt is detected (YES in step S303), the process proceeds to step S304. If not (NO in step S303), the process proceeds to step S308.
[0044]
(Step S304) The direction estimation unit 111 estimates the sound source direction based on the N-channel acoustic signals, for example by performing the process shown in FIG. 3.
Thereafter, the process proceeds to step S305. (Step S305) Based on the sound source direction
estimated by the direction estimation unit 111, the guidance direction determination unit 112
performs, for example, the process shown in FIG. 6 to determine the guidance direction.
Thereafter, the process proceeds to step S306. (Step S306) The guidance information generation
unit 113 generates guidance information for presenting the guidance direction determined by
the guidance direction determination unit 112. Thereafter, the process proceeds to step S307.
(Step S307) At least one of the display unit 102, the light emitting unit 104, and the vibration
unit 105 presents a guidance direction based on the guidance information generated by the
guidance information generation unit 113. Thereafter, the process according to this flowchart is
ended.
[0045]
(Step S308) The control unit 110 causes the display unit 102 to display a warning image indicating that the accuracy of the guidance direction is degraded, or prompting the user to hold the electronic device 1 parallel to the horizontal plane. Thereafter, the process proceeds to step S309. (Step S309) At least one of the following is stopped: the process in which the direction estimation unit 111 estimates the sound source direction, the process in which the guidance direction determination unit 112 determines the guidance direction, and the process in which the guidance information generation unit 113 generates the guidance information. Thereafter, the process according to this flowchart ends.
[0046]
As described above, according to the present embodiment, the sound source direction is estimated based on the acoustic signals of a plurality of channels, and a guidance direction determined based on the estimated sound source direction is presented. The guidance direction is defined as a direction shifted from the estimated sound source direction by a predetermined angle, in the direction in which the sound source direction is changing. As a result, the user to whom the guidance direction is presented can be guided in a direction that retreats from the obstacle generating the sound.
[0047]
Second Embodiment Next, the configuration of an electronic device 2 according to a second embodiment of the present invention will be described. The same reference numerals are attached to the same components as in the above-described embodiment, and their description is incorporated. FIG. 11 is a plan view of the electronic device 2 according to the present embodiment. In the electronic device 2, at least four sound collection units 103 are arranged so that they do not all lie in the same plane. In the example illustrated in FIG. 11, the electronic device 2 further includes sound collection units 103-5 and 103-6 on the front surface and the back surface of the electronic device 1 (FIG. 1), respectively. However, since the sound collection unit 103-6 is disposed on the back surface of the electronic device 2, it does not appear in FIG. 11. The X coordinate and the Y coordinate of the sound collection unit 103-6 may be equal to those of the sound collection unit 103-5. The sound collection units 103-5 and 103-6 may be unidirectional microphones like the sound collection units 103-1 to 103-4. In that case, the sound collection units 103-5 and 103-6 are arranged such that the directions of their sensitivity axes are perpendicular to the front surface and the back surface of the electronic device 1, respectively.
[0048]
FIG. 12 is a schematic block diagram showing the configuration of the electronic device 2 according to the present embodiment. The electronic device 2 includes a direction estimation unit 211 instead of the direction estimation unit 111 of the electronic device 1 (FIG. 2). In the electronic device 2, the acceleration sensor 106 and the tilt detection unit 114 (FIG. 2) may be omitted. Like the direction estimation unit 111, the direction estimation unit 211 estimates the sound source direction (azimuth angle) α in the plane of the electronic device 2 based on the acoustic signals input from the sound collection units 103-1 to 103-4. In that process, the channel Ma corresponding to the sound collection unit 103 with the highest calculated intensity is selected (FIG. 3, step S102). Thereafter, the direction estimation unit 211 selects the channel Mb corresponding to whichever of the sound collection units 103-5 and 103-6 receives the input acoustic signal with the higher intensity. The direction estimation unit 211 then performs the processing of steps S104 to S106 on the selected channels Ma and Mb to estimate the sound source direction (elevation angle) φ with respect to the plane of the electronic device 2. The sound source direction in three-dimensional space is thus represented by the azimuth angle α and the elevation angle φ in the device coordinate system referenced to the plane of the electronic device 2.
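The channel selection described in the preceding paragraph can be sketched as follows; the list representation of per-channel intensities and the index convention for Ma and Mb are assumptions for illustration.

```python
def select_channels(in_plane_intensities, front_back_intensities):
    """Sketch of the channel selection in paragraph [0048]: Ma is the
    in-plane channel (103-1 to 103-4) with the highest intensity
    (FIG. 3, step S102); Mb is whichever of the front/back units
    (103-5, 103-6) receives the stronger signal."""
    ma = max(range(len(in_plane_intensities)),
             key=lambda i: in_plane_intensities[i])
    # Index 0 stands for 103-5 (front), index 1 for 103-6 (back).
    mb = 0 if front_back_intensities[0] >= front_back_intensities[1] else 1
    return ma, mb
```

The selected pair (Ma, Mb) would then feed the azimuth and elevation estimation of steps S104 to S106.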
[0049]
When the electronic device 2 includes the acceleration sensor 106 and the tilt detection unit 114, the direction estimation unit 211 detects the gravity direction indicated by the acceleration signal input from the acceleration sensor 106. The direction estimation unit 211 projects the estimated sound source direction (α, φ) onto the horizontal plane perpendicular to the detected gravity direction, and calculates the azimuth angle α′ in the horizontal plane. The azimuth α of the sound source direction is thereby corrected to the azimuth α′ in the absolute coordinate system referenced to the horizontal plane. Here, it is assumed that the device coordinate system and the absolute coordinate system share a common origin O, and in the projection of the sound source direction (α, φ), the elevation angle φ′ is set to 0 in the absolute coordinate system. The direction estimation unit 211 outputs sound source direction information indicating the corrected azimuth α′ as the sound source direction to the guidance direction determination unit 112.
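The projection just described can be sketched as follows. The axis conventions (x to the right and y upward in the device plane, z out of the front face, azimuth measured from +x, angles in radians) are assumptions that the description itself does not fix, as is the sign of the rotation about the gravity axis.

```python
import math

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def _cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def _reject(v, g):
    """Component of v perpendicular to the unit vector g."""
    d = _dot(v, g)
    return tuple(x - d * y for x, y in zip(v, g))

def corrected_azimuth(alpha, phi, gravity):
    """Sketch of paragraph [0049]: project the device-frame sound
    source direction (azimuth alpha, elevation phi) onto the
    horizontal plane perpendicular to the measured gravity vector,
    and return the corrected azimuth alpha' in that plane."""
    n = math.sqrt(_dot(gravity, gravity))
    g = tuple(x / n for x in gravity)
    v = (math.cos(alpha) * math.cos(phi),
         math.sin(alpha) * math.cos(phi),
         math.sin(phi))
    p = _reject(v, g)                 # horizontal part of the direction
    r = _reject((1.0, 0.0, 0.0), g)   # horizontal part of the device +x axis
    # Signed angle from r to p about the gravity axis, mapped to [0, 2*pi).
    return math.atan2(_dot(_cross(r, p), g), _dot(r, p)) % (2 * math.pi)
```

When the device is level, the elevation component is simply discarded and the azimuth passes through unchanged.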
[0050]
Next, the relationship between the azimuths α and α′ before and after correction will be described. FIG. 13 is a diagram showing the relationship between the azimuths α and α′
before and after correction. An ellipse passing through the points B and F with the origin O as a
center indicates a plane parallel to the surface of the electronic device 2 in the device coordinate
system. Points B and F indicate points above and below the electronic device 2 from the origin O,
respectively. An ellipse passing through the points B′ and F′ with the origin O as a center indicates the horizontal plane in the absolute coordinate system, and the points B′ and F′ indicate points corresponding to the points B and F, respectively. The downward arrow OG starting from the origin O indicates the gravity direction, and the horizontal plane is a plane perpendicular to this direction. That is, the points B′ and F′ are points obtained by rotating the points B and F so that the direction perpendicular to the surface of the electronic device 2 coincides with the direction of gravity.
[0051]
Here, the sound source direction (α, φ) is the direction from the origin O to the point C regardless of the coordinate system. The azimuth angle (before correction) α in the device coordinate system is the angle between the line segment FO and the line segment PO, where the point P is obtained by projecting the point C so that the elevation angle φ becomes 0. The elevation angle φ in the device coordinate system is the angle between the line segment PO and the line segment CO. On the other hand, the azimuth angle (after correction) α′ in the absolute coordinate system is the angle between the line segment F′O and the line segment P′O, where the point P′ is obtained by projecting the point C so that the elevation angle φ′ becomes 0. The elevation angle φ′ in the absolute coordinate system is the angle between the line segment P′O and the line segment CO. Therefore, by converting the sound source direction (α, φ) estimated in the device coordinate system into the sound source direction (α′, φ′) in the absolute coordinate system, the direction estimation unit 211 can suppress the decrease in estimation accuracy caused by the tilt of the electronic device 2.
[0052]
Note that the electronic device 2 may further include one sound collection unit on each long side (sound collection units 103-7 and 103-8). FIG. 14 is a view showing an arrangement example of the sound collection units 103-1 to 103-8 according to the present embodiment. In this example, the long sides of the electronic device 2 are shown in the horizontal direction, but the arrangement of the sound collection units 103-1 to 103-6 is the same as the arrangement in FIG. The sound collection units 103-7 and 103-8 are disposed to the left of the midpoint in the X direction. Even if the user grips the left or right side of the electronic device 2, at least one of the pairs of sound collection units 103-2 and 103-7, or 103-4 and 103-8, is not covered by the gripping hand. Therefore, even if the long sides of the electronic device 2 are gripped along the user's left-right direction (held horizontally), the process of estimating the sound source direction is not disturbed. Here, in the process of estimating the sound source direction described above, the direction estimation unit 211 may adopt the acoustic signal with the higher intensity from either of the sound collection units 103-2 and 103-7, and likewise the acoustic signal with the higher intensity from either of the sound collection units 103-4 and 103-8. The direction estimation unit 211 is not limited to the process shown in FIG. 3 as the process of estimating the three-dimensional sound source direction; the sound source direction may instead be estimated based on the differences in arrival time of the sound waves among the sound collection units 103-1 to 103-N. In that case, as long as the sound collection units 103-1 to 103-8 are not all distributed on the same plane, they may be arranged at any positions on the surface of the electronic device 2.
[0053]
As described above, according to the present embodiment, the sound source direction in the
three-dimensional space is estimated based on the acoustic signals of at least four channels, and
the estimated sound source direction is projected on a horizontal plane and corrected. As a result,
even when the electronic device according to the present embodiment is inclined, the sound
source direction can be corrected accurately, so that a more accurate guidance direction can be
determined.
[0054]
Third Embodiment Next, the configuration of an electronic device 3 according to a third embodiment of the present invention will be described. The same reference numerals are attached to the same components as in the above-described embodiments, and their description is incorporated. In the electronic device 3 (not shown), which is based on the electronic device 1 (FIG. 2) or the electronic device 2 (FIG. 12), the guidance direction determination unit 112 may perform the process of determining the guidance direction (FIG. 6) only when it determines that the volume of the acoustic signal input from the sound collection unit 103 has increased, and may omit the process otherwise. The guidance direction determination unit 112 can determine whether the volume has increased based on, for example, whether the intensity of the channel Ma at the current time t2 is larger than the intensity of the channel Ma at the previous time t1. In this determination, the guidance direction determination unit 112 may use the sum of the intensities of the N-channel acoustic signals instead of the intensity of the channel Ma.
[0055]
Therefore, the guidance direction is determined when the obstacle serving as the sound source approaches the user, and is not determined when the obstacle is moving away from the user. Since a receding obstacle is unlikely to touch or collide with the user, the useless processing of determining a guidance direction for such an obstacle can be omitted.
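The gating condition of this embodiment can be sketched as follows; the list representation of per-channel intensities and the use of index 0 to stand for channel Ma are assumptions for illustration.

```python
def should_guide(frame_now, frame_prev, use_sum=False):
    """Sketch of the third embodiment's gate: determine the guidance
    direction only when the volume is increasing between times t1 and
    t2, i.e. the obstacle is likely approaching. frame_now/frame_prev
    hold per-channel intensities."""
    if use_sum:
        # Variant: compare the summed intensity of all N channels.
        return sum(frame_now) > sum(frame_prev)
    # Default: compare the intensity of channel Ma only (index 0 assumed).
    return frame_now[0] > frame_prev[0]
```

When this returns False, the process of FIG. 6 would simply be skipped for that frame.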
[0056]
In the embodiments described above, when the user is not using the electronic devices 1, 2, and 3, the process in which the direction estimation units 111 and 211 estimate the sound source direction, the process in which the guidance direction determination unit 112 determines the guidance direction, and the process in which the guidance information generation unit 113 generates the guidance information may be stopped. "Not in use" corresponds, for example, to the case where the time during which no operation input by the user is detected continues longer than a predetermined time (for example, 3 minutes), or the case where the display unit 102 does not display an image (backlight not lit). This can reduce power consumption.
[0057]
The above-described embodiments can also be implemented in the following aspects. (1) An electronic device comprising: a direction estimation unit that estimates a sound source direction based on acoustic signals of a plurality of channels; a guidance direction determination unit that determines a guidance direction based on the sound source direction estimated by the direction estimation unit; and a direction presentation unit that presents the guidance direction determined by the guidance direction determination unit.
[0058]
(2) The electronic device of (1), wherein the sound collection units that record the acoustic signal of each channel are arranged at mutually different positions, and the direction estimation unit estimates the sound source direction based on the intensity ratio of the acoustic signal among the plurality of channels.
[0059]
(3) The electronic device of (1) or (2), further comprising: a tilt detection unit that detects a tilt from the horizontal direction; and a tilt determination unit that determines whether the tilt detected by the tilt detection unit is larger than a predetermined tilt threshold.
[0060]
(4) The electronic device of any one of (1) to (3), wherein the guidance direction determination unit defines, as the guidance direction, a direction shifted by a predetermined angle from the sound source direction estimated by the direction estimation unit, in the direction in which the sound source direction changes.
[0061]
(5) The electronic device of any one of (1) to (4), wherein the direction presentation unit includes a plurality of elements arranged at different positions, and the elements present the signal in the order in which they are arranged in the guidance direction or in the direction closest to the guidance direction.
[0062]
(6) An information processing method in an electronic device, comprising: a direction estimation step of estimating a sound source direction based on acoustic signals of a plurality of channels; a guidance direction determination step of determining a guidance direction based on the sound source direction estimated in the direction estimation step; and a direction presentation step of presenting the guidance direction determined in the guidance direction determination step.
[0063]
(7) An information processing program causing a computer of an electronic device to execute: a direction estimation procedure for estimating a sound source direction based on acoustic signals of a plurality of channels; a guidance direction determination procedure for determining a guidance direction based on the sound source direction estimated in the direction estimation procedure; and a direction presentation procedure for presenting the guidance direction determined in the guidance direction determination procedure.
[0064]
According to (1), (6), or (7) described above, the user to whom the guidance direction determined based on the sound source direction is presented can be guided in a direction that retreats from the obstacle generating the sound. According to (3) described above, the sound source direction can be estimated with a small amount of calculation by using the intensity ratios that differ among the sound collection units. According to (4) described above, a direction in which the user can effectively evacuate from a moving obstacle is presented. According to (5) described above, the guidance direction is presented without interfering with the functions of the electronic device.
[0065]
Note that some of the electronic devices 1, 2, and 3 in the above-described embodiment, for
example, the control unit 110 may be realized by a computer.
In that case, a program for realizing the control function may be recorded in a computer readable
recording medium, and the program recorded in the recording medium may be read and
executed by a computer system.
Here, the “computer system” is a computer system built in the electronic devices 1, 2, and 3
and includes an OS and hardware such as peripheral devices.
The term "computer-readable recording medium" refers to a portable medium such as a flexible disk, a magneto-optical disk, a ROM, or a CD-ROM, or a storage device such as a hard disk built in a computer system. Furthermore, the "computer-readable recording medium" may include a medium that dynamically holds a program for a short time, such as a communication line in the case of transmitting a program via a network such as the Internet or a communication line such as a telephone line, and a medium that holds a program for a certain period of time, such as a volatile memory in a computer system serving as a server or a client in that case. The program may realize a part of the functions described above, or may realize those functions in combination with a program already recorded in the computer system. In addition, part or all of the electronic devices 1, 2, and 3 in the above-described embodiments may be realized as an integrated circuit such as an LSI (Large Scale Integration). Each functional block of the electronic devices 1, 2, and 3 may be individually implemented as a processor, or part or all of them may be integrated into a processor. Further, the method of circuit integration is not limited to LSI, and implementation using a dedicated circuit or a general-purpose processor is also possible. If integrated circuit technology that replaces LSI emerges as semiconductor technology advances, an integrated circuit based on that technology may also be used.
[0066]
As described above, one embodiment of this invention has been described in detail with reference to the drawings; however, the specific configuration is not restricted to the above, and various design changes and the like can be made without departing from the gist of the present invention.
[0067]
1, 2, 3: electronic device; 101: housing; 102: display unit; 103 (103-1 to 103-N): sound collection unit; 104 (104-1 to 104-M1): light emitting unit; 105 (105-1 to 105-M2): vibration unit; 106: acceleration sensor; 110: control unit; 111, 211: direction estimation unit; 112: guidance direction determination unit; 113: guidance information generation unit; 114: tilt detection unit