DESCRIPTION JP2013236130

Patent Translate
Powered by EPO and Google
Notice
This translation is machine-generated. It cannot be guaranteed that it is intelligible, accurate,
complete, reliable or fit for specific purposes. Critical decisions, such as commercially relevant or
financial decisions, should not be based on machine-translation output.
Abstract: To guide the position of an ear to a suitable position when using an electronic device that transmits air conduction sound and vibration sound. An electronic apparatus (mobile phone) 1A includes a piezoelectric element 7 and a sound generation unit (panel 20) that is vibrated by the piezoelectric element 7 to generate air conduction sound and vibration sound transmitted by vibrating a part of a human body. The electronic device 1A performs notification to notify the user of a specific position in the sound generation unit 20. The specific position is, for example, a position where it is recommended to place a part of the human body. [Selected figure] Figure 1
Electronic device, control method, and control program
[0001]
The present application relates to an electronic device, a control method, and a control program.
[0002]
Patent Document 1 describes an electronic device that transmits air conduction sound and vibration sound to a user.
Patent Document 1 describes that when a voltage is applied to a piezoelectric element of a
vibrating body disposed on the outer surface of a housing of an electronic device, the vibrating
body flexes and vibrates due to expansion and contraction of the piezoelectric element.
11-04-2019
1
Moreover, Patent Document 1 describes that the air conduction sound and the vibration sound are transmitted to the user when the user brings the pinna into contact with the vibrating body that flexes and vibrates. According to Patent Document 1, the air conduction sound is a sound transmitted to the user's auditory nerve when the vibration of air caused by the vibration of an object is transmitted to the tympanic membrane through the ear canal and the tympanic membrane vibrates. Further, according to Patent Document 1, the vibration sound is a sound transmitted to the user's auditory nerve through a part of the user's body (for example, the cartilage of the outer ear) contacting the vibrating object.
[0003]
JP 2005-348193 A
[0004]
Incidentally, it is generally preferable for the user of such an electronic device to know the position at which the sound can be easily heard.
[0005]
An electronic device according to one aspect includes: a piezoelectric element; and a sound generation unit that is vibrated by the piezoelectric element to generate air conduction sound and vibration sound transmitted by causing a part of a human body to vibrate.
The electronic device performs notification to notify a user of a specific position in the sound
generation unit.
[0006]
A control method according to one aspect is a control method executed by an electronic device including a sound generation unit and a piezoelectric element, the method including: notifying a user of a specific position in the sound generation unit; and vibrating the sound generation unit by the piezoelectric element to generate air conduction sound and vibration sound transmitted by vibrating a part of the human body.
[0007]
A control program according to one aspect causes an electronic device including a panel and a piezoelectric element to execute the steps of: notifying a user of a specific position in the sound generation unit; and vibrating the sound generation unit by the piezoelectric element to generate air conduction sound and vibration sound transmitted by vibrating a part of the human body.
[0008]
FIG. 1 is a front view of a mobile phone according to the embodiment.
FIG. 2 is a cross-sectional view of the mobile phone according to the embodiment.
FIG. 3 is a view showing an example of the shape of a panel.
FIG. 4 is a diagram showing an example of the vibration of the panel. FIG. 5 is a block diagram of a mobile phone according to the embodiment. FIG. 6A is a diagram for explaining the guidance performed before contact. FIG. 6B is a diagram for explaining the guidance performed before contact. FIG. 6C is a diagram for explaining the guidance performed before contact. FIG. 7A is a diagram for describing detection of the position of the ear. FIG. 7B is a diagram for describing detection of the position of the ear. FIG. 7C is a diagram for describing detection of the position of the ear. FIG. 8 is a diagram for explaining the guidance performed after contact. FIG. 9A is a diagram showing an example of control change. FIG. 9B is a diagram showing an example of control change. FIG. 10 is a flowchart showing the procedure of control for guiding the position of the ear to the standard position. FIG. 11 is a flowchart of the control procedure at the time of call reception. FIG. 12 is a front view of a mobile phone according to another embodiment. FIG. 13 is a cross-sectional view of a mobile phone according to another embodiment. FIG. 14 is a diagram for describing detection of the position of the ear. FIG. 15 is a diagram for explaining the guidance performed before contact. FIG. 16 is a front view of a mobile phone according to another embodiment. FIG. 17 is a cross-sectional view of a mobile phone according to another embodiment. FIG. 18 is a diagram showing an example of the resonant frequency of the panel. FIG. 19 is a diagram for describing detection of the position of the ear. FIG. 20 is a diagram for explaining the guidance performed before contact. FIG. 21 is a diagram showing an example in which an image simulating the shape of the ear is displayed on the display.
[0009]
Embodiments for carrying out the present invention will be described in detail with reference to the drawings. Hereinafter, a mobile phone will be described as an example of an electronic device that transmits air conduction sound and vibration sound to a user.
[0010]
First Embodiment The overall configuration of a mobile phone 1A according to an embodiment
will be described with reference to FIGS. 1 and 2. FIG. 1 is a front view of the mobile phone 1A.
FIG. 2 is a cross-sectional view schematically showing the a-a cross section of the mobile phone
1A. As shown in FIGS. 1 and 2, the mobile phone 1A includes a display 2, a button 3, an illuminance sensor 4, a proximity sensor 5, a piezoelectric element 7, a microphone 8, a camera 12, a panel 20, and a housing 40.
[0011]
The display 2 includes a display device such as a liquid crystal display (LCD: Liquid Crystal
Display), an organic EL display (OELD: Organic Electro-Luminescence Display), or an inorganic EL
display (IELD: Inorganic Electro-Luminescence Display). The display 2 displays characters,
images, symbols, figures, and the like.
[0012]
The button 3 receives an operation input from the user. The number of buttons 3 is not limited to
the example shown in FIGS. 1 and 2.
[0013]
The illuminance sensor 4 detects the illuminance of the ambient light of the mobile phone 1A. The illuminance indicates light intensity, brightness, or luminance. The illuminance sensor 4 is used, for example, to adjust the brightness of the display 2. The proximity sensor 5 detects the presence of a nearby object without contact. The proximity sensor 5 detects the presence of an object based on a change in the magnetic field, a change in the return time of a reflected ultrasonic wave, or the like. The proximity sensor 5 detects, for example, that the display 2 has been brought close to a face. The illuminance sensor 4 and the proximity sensor 5 may be configured as one sensor. The illuminance sensor 4 may be used as a proximity sensor.
[0014]
When an electric signal (voltage according to a sound signal) is applied, the piezoelectric element
7 expands and contracts or bends according to the electromechanical coupling coefficient of the
constituent material. That is, the piezoelectric element 7 deforms when an electric signal is
applied. The piezoelectric element 7 is attached to the panel 20 and used as a vibration source
for vibrating the panel 20. The piezoelectric element 7 is formed, for example, using ceramic or
quartz. The piezoelectric element 7 may be a unimorph, a bimorph, or a laminated piezoelectric
element. The laminated piezoelectric element includes a laminated bimorph element in which
bimorphs are laminated (for example, 16 layers or 24 layers are laminated). The laminated
piezoelectric element is formed of, for example, a laminated structure of a plurality of dielectric
layers made of PZT (lead zirconate titanate) and an electrode layer disposed between the
plurality of dielectric layers. Unimorphs expand and contract when an electrical signal (voltage)
is applied. The bimorph bends when an electrical signal (voltage) is applied.
[0015]
The microphone 8 is a sound input unit. The microphone 8 converts the input sound into an
electrical signal. The speaker 11 is a sound output unit that outputs sound by air conduction. The speaker 11 is, for example, a dynamic speaker, and can transmit the sound
converted from the electric signal to a person whose ear is not in contact with the mobile phone
1A. The speaker 11 is used, for example, to output music.
[0016]
The camera 12 is an in-camera that captures an object facing the display 2. The camera 12
converts the captured image into an electrical signal. In addition to the camera 12, the mobile
phone 1A may be equipped with an out-camera that captures an object facing the surface opposite to the display 2.
[0017]
The panel 20 vibrates as the piezoelectric element 7 deforms (expands, contracts, or bends), and transmits the vibration to the ear cartilage (auricular cartilage) or the like that the user brings into contact with the panel 20. The panel 20 also has a function of protecting the display 2 and the piezoelectric element 7
from external force. The panel 20 is formed of, for example, glass or a synthetic resin such as
acrylic. The shape of the panel 20 is, for example, a plate. The panel 20 may be a flat plate. The
panel 20 may be a curved panel whose surface is curved smoothly.
[0018]
The display 2 and the piezoelectric element 7 are attached to the back surface of the panel 20 by
the bonding member 30. The piezoelectric element 7 is spaced apart from the inner surface of
the housing 40 by a predetermined distance in a state of being disposed on the back surface of
the panel 20. The piezoelectric element 7 may be spaced apart from the inner surface of the
housing 60 even in a state of expansion and contraction or bending. That is, the distance
between the piezoelectric element 7 and the inner surface of the housing 40 may be larger than
the maximum amount of deformation of the piezoelectric element 7. The piezoelectric element 7
may be attached to the panel 20 via a reinforcing member (for example, a sheet metal or a glass
fiber reinforced resin). The bonding member 30 is, for example, a double-sided tape, or an
adhesive having thermosetting or ultraviolet curing properties. The bonding member 30 may be
an optical elastic resin which is a colorless and transparent acrylic ultraviolet curable adhesive.
[0019]
The display 2 is disposed substantially at the center of the panel 20 in the short direction. The piezoelectric element 7 is disposed at a predetermined distance from a longitudinal end of the panel 20 so that the longitudinal direction of the piezoelectric element 7 is parallel to the short direction of the panel 20. The display 2 and the piezoelectric element 7 are arranged side by side on the inner surface of the panel 20.
[0020]
A touch screen (touch sensor) 21 is disposed on substantially the entire surface of the outer
surface of the panel 20. The touch screen 21 detects a touch on the panel 20. The touch screen
21 is used to detect a touch operation performed by the user with a finger, a pen, a stylus pen, or the like.
The gestures detected using the touch screen 21 include, but are not limited to, touch, long
touch, release, swipe, tap, double tap, long tap, drag, flick, pinch in and pinch out, for example.
The detection method of the touch screen 21 may be any method such as a capacitance method,
a resistive film method, a surface acoustic wave method (or ultrasonic method), an infrared
method, an electromagnetic induction method, and a load detection method.
[0021]
The touch screen 21 is also used to detect auricular cartilage or the like that contacts the panel
20 to hear a sound.
[0022]
The housing 40 is formed using a resin or a metal.
The housing 40 supports the button 3, the illuminance sensor 4, the proximity sensor 5, the
microphone 8, the speaker 11, the camera 12, the panel 20 and the like.
[0023]
The sound output by the mobile phone 1A according to the embodiment will be described in
more detail with reference to FIGS. 1 to 4. FIG. 3 is a view showing an example of the shape of
the panel 20. FIG. 4 is a diagram showing an example of the vibration of the panel 20.
[0024]
An electrical signal corresponding to the sound to be output is applied to the piezoelectric element 7. For example, ±15 V, which is higher than the ±5 V applied to a so-called panel speaker that transmits sound by air conduction through the ear canal, may be applied to the piezoelectric element 7. Thereby, even if the user presses a part of his or her body against the panel 20 with a force of 3 N or more (for example, a force of 5 N to 10 N), sufficient vibration is generated in the panel 20 to generate a vibration sound that is transmitted through the part of the user's body. The voltage applied to the piezoelectric element 7 can be appropriately adjusted according to the strength with which the panel 20 is fixed to the housing 40, the performance of the piezoelectric element 7, and the like.
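The voltage figures above can be illustrated with a minimal sketch. This is not taken from the document beyond the ±15 V limit: the function name, the normalized-sample input format, and the simple hard clipping are all assumptions for illustration.

```python
def sound_to_drive_voltage(samples, v_max=15.0):
    """Map normalized sound samples in [-1.0, 1.0] to a piezoelectric drive
    voltage clipped to +/- v_max volts (the text cites +/-15 V, versus the
    +/-5 V used for a conventional panel speaker).

    Note: the linear mapping and clipping here are illustrative assumptions,
    not the patent's actual drive circuit."""
    return [max(-v_max, min(v_max, s * v_max)) for s in samples]

# A full-scale sample maps to the rail; out-of-range samples are clipped.
print(sound_to_drive_voltage([0.0, 0.5, 1.0, -1.2]))
```

In practice the applied voltage would also be adjusted for the panel's fixing strength and the element's characteristics, as the paragraph above notes.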
[0025]
When an electrical signal is applied, the piezoelectric element 7 stretches or bends in the
longitudinal direction. The panel 20 to which the piezoelectric element 7 is attached deforms in
accordance with the expansion or contraction or bending of the piezoelectric element 7. Thereby,
the panel 20 vibrates to generate air conduction sound. Furthermore, when the user brings a part of the body (for example, the auricular cartilage) into contact with the panel 20, the panel 20 generates a vibration sound that is conducted to the user through that part of the body. That is, as the piezoelectric element 7 deforms, the panel 20 vibrates at a frequency perceivable as a vibration sound by an object in contact with the panel 20.
[0026]
For example, when an electrical signal corresponding to the voice of the other party in a call, or to sound data such as a ringtone or music, is applied to the piezoelectric element 7, the panel 20 generates air conduction sound and vibration sound corresponding to the electrical signal. The sound signal output through the piezoelectric element 7 and the panel 20 may be based on sound data stored in the storage 9 described later. The sound signal output via the piezoelectric element 7 and the panel 20 may be based on sound data stored in an external server or the like and acquired via a network by the communication unit 6 described later.
[0027]
In the present embodiment, the panel 20 may be approximately the same size as the user's ear. As shown in FIG. 3, the panel 20 may also be larger than the user's ear. In this case, the user can bring substantially the entire outer periphery of the ear into contact with the panel 20 when listening to the sound. Listening to the sound in this manner makes ambient sound (noise) less likely to enter the ear canal. In the present embodiment, at least an area of the panel 20 vibrates that is wider than an area having a length in the longitudinal direction (or short direction) corresponding to the distance from the inferior crus of the antihelix to the antitragus of a human ear, and a length in the short direction (or longitudinal direction) corresponding to the distance from the tragus to the antihelix. A region of the panel 20 may vibrate that has a length in the longitudinal direction (or short direction) corresponding to the distance from the part of the helix near its superior portion to the earlobe, and a length in the short direction (or longitudinal direction) corresponding to the distance from the vicinity of the tragus to the vicinity of the antihelix. The region having these lengths may be a rectangular region, or an elliptical region whose major axis is the length in the longitudinal direction and whose minor axis is the length in the short direction. The average size of the human ear can be found, for example, by referring to the Japanese Human Body Size Database (1992-1994) prepared by the Human Life Engineering Research Center (HQL).
[0028]
As shown in FIG. 4, the panel 20 vibrates not only in the attachment area 20a to which the piezoelectric element 7 is attached, but also in areas away from the attachment area 20a. The panel 20 has, in its vibrating area, a plurality of points that vibrate in a direction intersecting the main surface of the panel 20, and at each of these points the value of the vibration amplitude changes from positive to negative, or vice versa, with time. At each moment, the panel 20 vibrates such that parts with relatively large vibration amplitude and parts with relatively small vibration amplitude appear to be distributed randomly or regularly over substantially the entire panel 20. That is, vibrations of a plurality of waves are detected over the entire area of the panel 20. If the voltage applied to the piezoelectric element 7 is ±15 V as described above, the vibration of the panel 20 is unlikely to be damped even if the user presses the panel 20 against the body with a force of, for example, 5 N to 10 N. Therefore, the user can hear the vibration sound even if the ear is brought into contact with an area of the panel 20 away from the attachment area 20a.
[0029]
In the present embodiment, the display 2 is attached to the panel 20. Therefore, the rigidity of
the lower portion of the panel 20 (the side on which the display 2 is attached) is increased, and
the vibration is smaller than the upper portion of the panel 20 (the side on which the
piezoelectric element 7 is attached). For this reason, in the lower part of the panel 20, the sound
leakage of air conduction sound due to the vibration of the panel 20 is reduced.
[0030]
The mobile phone 1A can transmit air conduction sound and vibration sound through a part of the user's body (for example, the auricular cartilage) to the user by the vibration of the panel 20. Therefore, when the mobile phone 1A outputs a sound at a volume equal to that of a dynamic receiver, the sound transmitted to the surroundings of the mobile phone 1A by air vibration can be reduced compared with an electronic device having only a dynamic speaker. Such a feature is suitable, for example, for listening to a recorded message in a place where others are nearby, such as on a train.
[0031]
Furthermore, the mobile phone 1A transmits the vibration sound to the user by the vibration of the panel 20. Therefore, even when wearing earphones or headphones, the user can hear the vibration sound caused by the vibration of the panel 20 through the earphones or headphones and a part of the body by bringing the mobile phone 1A into contact with them.
[0032]
Furthermore, the mobile phone 1A transmits sound by the vibration of the panel 20. Therefore, when the mobile phone 1A is not separately provided with a dynamic receiver, there is no need to form an opening (sound emission port) in the housing 40 for transmitting the sound emitted by the panel 20 to the outside. For this reason, the structure can be simplified when realizing a waterproof structure. When an opening such as the sound emission port of a dynamic speaker must be formed in the housing 40, the mobile phone 1A may adopt a structure in which the opening is closed by a member that allows gas, but not liquid, to pass in order to realize a waterproof structure. The member that allows gas but not liquid to pass
[0033]
The functional configuration of the mobile phone 1A will be described with reference to FIG. 5. FIG. 5 is a block diagram of the mobile phone 1A. As shown in FIG. 5, the mobile phone 1A includes a display 2, a button 3, an illuminance sensor 4, a proximity sensor 5, a communication unit 6, a piezoelectric element 7, a microphone 8, a storage 9, a controller 10, a speaker 11, a camera 12, a posture detection unit 15, a vibrator 18, and a touch screen 21.
[0034]
The communication unit 6 communicates wirelessly. The communication schemes supported by the communication unit 6 are wireless communication standards. Examples of wireless communication standards include cellular-phone communication standards such as 2G, 3G, and 4G. Examples of cellular-phone communication standards include Long Term Evolution (LTE), Wideband Code Division Multiple Access (W-CDMA), CDMA2000, Personal Digital Cellular (PDC), GSM (Global System for Mobile Communications), and PHS (Personal Handy-phone System). Other examples of wireless communication standards include WiMAX (Worldwide Interoperability for Microwave Access), IEEE 802.11, Bluetooth (registered trademark), IrDA (Infrared Data Association), and NFC (Near Field Communication). The communication unit 6 may support one or more of the communication standards described above.
[0035]
The storage 9 stores programs and data. The storage 9 is also used as a work area for temporarily storing processing results of the controller 10. The storage 9 may include any non-transitory storage medium, such as a semiconductor storage medium or a magnetic storage medium. The storage 9 may include multiple types of storage media. The storage 9 may include a combination of a portable storage medium, such as a memory card, an optical disc, or a magneto-optical disc, and a reader for the storage medium. The storage 9 may include a storage device used as a temporary storage area, such as a random access memory (RAM).
[0036]
The programs stored in the storage 9 include applications executed in the foreground or background and a control program that supports the operation of the applications. An application, for example, causes the display 2 to display a screen and causes the controller 10 to execute processing in accordance with a gesture detected by the touch screen 21. The control program is, for example, an OS. The applications and the control program may be installed in the storage 9 via wireless communication by the communication unit 6 or via a non-transitory storage medium.
[0037]
The storage 9 stores, for example, a control program 9A, a call application 9B, a music
reproduction application 9C, a moving image reproduction application 9D, and setting data 9Z.
The call application 9B provides a call function for a call by wireless communication. The music
reproduction application 9C provides a music reproduction function for reproducing a sound
from music data. The moving image reproduction application 9D provides a moving image reproduction function for playing back moving images and sound from moving image data. The setting data 9Z includes information on
various settings related to the operation of the mobile phone 1A.
[0038]
The control program 9A provides functions related to various controls for operating the mobile
phone 1A. The control program 9A realizes a call by, for example, controlling the communication
unit 6, the piezoelectric element 7, the microphone 8, and the like. The functions provided by the
control program 9A include a function of performing control for guiding the position of the ear
touching the panel 20 to a suitable position to hear the sound. The function provided by the
control program 9A may be used in combination with the function provided by another program
such as the call application 9B.
[0039]
The controller 10 is an arithmetic processing unit. The arithmetic processing apparatus includes,
for example, a central processing unit (CPU), a system-on-a-chip (SoC), a micro control unit
(MCU), and a field-programmable gate array (FPGA), but is not limited thereto. The controller 10
comprehensively controls the operation of the mobile phone 1A to realize various functions.
[0040]
Specifically, the controller 10 executes an instruction included in the program stored in the
storage 9 while referring to the data stored in the storage 9 as necessary. Then, the controller 10
controls the function unit according to the data and the instruction, thereby realizing various
functions. The functional unit includes, for example, the display 2, the communication unit 6, the
piezoelectric element 7, the microphone 8, the speaker 11, and the vibrator 18, but is not limited
thereto. The controller 10 may change the control according to the detection result of the
detection unit. The detection unit includes, for example, the button 3, the illuminance sensor 4,
the proximity sensor 5, the camera 12, the posture detection unit 15, and the touch screen 21,
but is not limited thereto.
[0041]
The controller 10 executes, for example, the control program 9A to execute control for guiding
the position of the ear touching the panel 20 to a suitable position to hear the sound.
[0042]
The attitude detection unit 15 detects the attitude of the mobile phone 1A.
The attitude detection unit 15 includes at least one of an acceleration sensor, an orientation
sensor, and a gyroscope to detect an attitude. The vibrator 18 vibrates part or all of the mobile
phone 1A. The vibrator 18 has, for example, a piezoelectric element or an eccentric motor to
generate vibration. The vibration by the vibrator 18 is not used to transmit a sound but to notify
the user of various events such as an incoming call.
[0043]
Some or all of the programs and data stored in the storage 9 in FIG. 5 may be downloaded from
another device by wireless communication by the communication unit 6. Some or all of the
programs and data stored in the storage 9 in FIG. 5 may be stored in a non-transitory storage
medium readable by the reader included in the storage 9. Non-transitory storage media include,
for example, optical disks such as CD (registered trademark), DVD (registered trademark), Blu-ray
(registered trademark), magneto-optical disks, magnetic storage media, memory cards, and solid-state storage media, but are not limited to these.
[0044]
The configuration of the mobile phone 1A shown in FIG. 5 is an example, and may be changed as
appropriate without departing from the scope of the present invention. For example, the mobile
phone 1A may have buttons in a ten-key layout or a QWERTY layout as buttons for operation.
[0045]
Control for guiding the position of the ear contacting the panel 20 to the standard position for hearing the sound will be described with reference to FIGS. 6A to 9B. The standard position is a position on the panel 20 where the user can suitably hear the vibration sound. FIGS. 6A to 6C are diagrams for explaining the guidance performed before the ear contacts the panel 20. FIGS. 7A to 7C are diagrams for describing detection of the position of the ear in contact with the panel 20. FIG. 8 is a diagram for explaining the guidance performed after the ear contacts the panel 20. FIGS. 9A and 9B are diagrams showing examples of control changes.
[0046]
The control for guiding the position of the ear to the standard position includes control performed before the ear contacts the panel 20 and control performed after the ear contacts the panel 20. The control performed before the ear contacts the panel 20 will be described first.
[0047]
At a stage before the ear contacts the panel 20, the mobile phone 1A displays, on the display 2, information for guiding the ear attempting to contact the panel 20 to the standard position, as shown in FIGS. 6A to 6C. At this stage, the mobile phone 1A may or may not vibrate the panel 20. That is, the mobile phone 1A may start applying an electrical signal to the piezoelectric element 7 before the ear contacts the panel 20, or may begin applying an electrical signal to the piezoelectric element 7 after the ear contacts the panel 20.
[0048]
As shown in FIG. 6A, in order to guide the ear that is about to contact the panel 20 to the standard position, the mobile phone 1A may display on the display 2 a frame 61 of substantially the same size as an ear and a message prompting the user to place the ear within the frame 61. Alternatively, as shown in FIGS. 6B and 6C, the mobile phone 1A may display on the display 2 a symbol 63 or 65 corresponding to a predetermined part of the ear and a message 64 or 66 prompting the user to align that part with the symbol 63 or 65. The part of the ear corresponding to the symbol may be a part whose position the user can easily align with the symbol, such as the upper end of the ear, or a part that easily contacts the panel 20, such as the tragus.
[0049]
As described above, by displaying information for guiding the ear to the standard position on the display 2, the user can easily grasp where on the panel 20 the ear should be placed to hear the sound easily. The information for guiding the ear to the standard position is not limited to the examples shown in FIGS. 6A to 6C, and may be any information that allows the user to understand where on the panel 20 the ear should be brought into contact. As information for guiding the ear to the standard position, the mobile phone 1A may display, for example, the shape of an ear as shown in FIG. 3 on the display 2.
[0050]
In the present embodiment, the mobile phone 1A learns the standard position based on the position of the ear when the user uses the mobile phone 1A. When the user touches the panel 20 with the ear to hear the vibration sound, the user is considered to adjust the position of the ear on the panel 20 while listening so that the sound can be heard easily. Therefore, while the ear is in contact with the panel 20, the mobile phone 1A tracks the position of the ear and determines the standard position based on the tracking result. For example, the mobile phone 1A determines, as the standard position, the position where the ear in contact with the panel 20 stays for the longest time while the electrical signal is being applied to the piezoelectric element 7. This position may be a specific position within the area where contact of the ear is detected. The mobile phone 1A may determine, as a standard area, an area where the detected contact of the ear does not change for a certain period of time while the electrical signal is being applied to the piezoelectric element 7. For example, the mobile phone 1A may determine the standard position based on information on a specific area of an image (described later) obtained from the detection result of the touch screen 21 when an object is in contact. The determined standard position is stored in the setting data 9Z.
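The dwell-time learning described above can be sketched as follows. This is a hypothetical illustration only: the document says merely that the position where the ear stays longest while the electrical signal is applied becomes the standard position; the sampled-position input, the grid quantization, and all names are assumptions.

```python
from collections import Counter

def learn_standard_position(tracked_positions, grid=10):
    """Sketch of the dwell-time idea: while the ear stays on the panel, its
    detected position is sampled at a fixed rate; the grid cell occupied for
    the longest total time is taken as the standard position.

    `tracked_positions` is a list of (x, y) samples; `grid` is an assumed
    quantization step in pixels (not specified by the document)."""
    # Quantize each sample to a grid cell and count dwell samples per cell.
    cells = Counter((int(x // grid), int(y // grid)) for x, y in tracked_positions)
    (cx, cy), _ = cells.most_common(1)[0]
    # Return the center of the most-dwelled cell as the standard position.
    return (cx * grid + grid / 2, cy * grid + grid / 2)

# Three samples dwell near (13, 13); one stray sample is elsewhere.
print(learn_standard_position([(12, 12), (13, 14), (18, 12), (55, 60)]))
```

A real implementation would run this continuously while the signal is applied and persist the result, as the text's setting data 9Z suggests.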
[0051]
In order to reduce the load on the mobile phone 1A, the determination of the standard position may be performed only for a predetermined period after the start of use of the mobile phone 1A, or may be performed periodically at a predetermined interval. Alternatively, the mobile phone 1A may not determine the standard position during normal use, but may perform a calibration process to determine the standard position when use of the mobile phone 1A is started or when instructed by the user. In the calibration process, while applying an electrical signal to the piezoelectric element 7 to vibrate the panel 20, the mobile phone 1A displays on the display 2 a message prompting the user to bring the ear into contact with the panel 20 and rest it at a position where the sound can be easily heard, and starts determination of the standard position. Alternatively, the standard position may be fixedly set in advance to a position where most people can easily hear the sound.
[0052]
The mobile phone 1A detects the position of the ear in contact with the panel 20 using the touch
screen 21. For example, as shown in FIG. 7A, the mobile phone 1A detects a position 72
determined based on the area 71 where the touch of the ear is detected by the touch screen 21
as the position of the ear. The position 72 is, for example, the center (center of gravity) of the
area 71. The position 72 may also be one of the vertices of the smallest rectangle that includes
the area 71. The position 72 may be a position corresponding to a predetermined part of the ear.
In this case, the position 72 is calculated from its relative positional relationship with the
area 71, based on information on the typical position of that part within the ear.
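The center-of-gravity calculation for deriving the position 72 from the area 71 can be sketched as follows; the function name and grid-cell representation are illustrative assumptions.

```python
# Illustrative sketch: position 72 as the center of gravity of the
# grid cells in which ear contact is detected (area 71).

def contact_center(cells):
    """cells: iterable of (x, y) grid coordinates reported as touched.
    Returns the centroid of the contact area."""
    cells = list(cells)
    if not cells:
        raise ValueError("no contact detected")
    n = len(cells)
    return (sum(x for x, _ in cells) / n, sum(y for _, y in cells) / n)
```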
[0053]
This scheme makes it possible to detect the position of the ear touching the panel 20 without
performing complicated calculations. Furthermore, this method can also be applied when the
number of points at which the touch screen 21 can simultaneously detect a touch on the panel
20 is small.
[0054]
Alternatively, as shown in FIG. 7B, the mobile phone 1A detects the position of the ear by pattern
matching between an image 73, obtained based on the detection result of the touch screen 21 when
an object is in contact, and a sample 74 prepared in advance. The image 73 is obtained by dividing
the detection area of the touch screen 21 into a grid and converting the detection state of the
touch of the object in each of the divided areas into the state of the corresponding pixel. When
the value detected by the touch screen 21 in each region fluctuates due to, for example, the
distance between the touch screen 21 and the object, or the pressure with which the object presses
the touch screen 21, the image 73 may be a multi-tone image.
[0055]
The sample 74 is an image that should be obtained, in the same manner as the image 73, in the area
contacted by the ear when the ear is in contact. The sample 74 may be an image that should be
obtained when the ear of the user of the mobile phone 1A makes contact, or may be an image that
should be obtained when a typical person's ear makes contact. A plurality of samples 74 may be
prepared, such as an image of the right ear and an image of the left ear.
[0056]
The sample 74 includes a reference position 74a corresponding to a predetermined part of the ear.
The reference position 74a is located at (x1, y1) with the upper left of the sample 74 as the
origin. The reference position 74a may be set based on information on the position of that part in
the typical human ear. If the sample 74 is an image actually obtained when the ear of the user of
the mobile phone 1A made contact, the reference position 74a may be set by analyzing the image.
[0057]
When the image 73 is obtained, the mobile phone 1A obtains the relative position of the image 73
and the sample 74 at which the two are best matched by pattern matching. When it is determined by
pattern matching that the image 73 and the sample 74 do not match (for example, when the degree of
matching is lower than a threshold), the mobile phone 1A may determine that contact of the ear is
not detected. When the relative position is obtained, the mobile phone 1A calculates the position
of the ear based on the relative position and the reference position 74a. In the example of FIG.
7B, the two match best when the sample 74 is shifted by x2 in the X-axis direction and y2 in the
Y-axis direction with reference to the upper left of the image 73. In this case, the position of
the ear is calculated as (x1 + x2, y1 + y2).
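The matching step of this paragraph can be sketched as an exhaustive sliding-window comparison; the pixel-agreement score, the 90% acceptance threshold, and all names are illustrative assumptions rather than the disclosed implementation.

```python
# Illustrative sketch: slide the sample over the binary contact image,
# pick the offset with the highest pixel agreement, and add the
# sample's reference position (x1, y1) to that offset.

def best_offset(image, sample):
    """image, sample: 2D lists of 0/1 contact pixels (image larger).
    Returns ((dx, dy), score) maximizing pixel agreement."""
    ih, iw = len(image), len(image[0])
    sh, sw = len(sample), len(sample[0])
    best = ((0, 0), -1)
    for dy in range(ih - sh + 1):
        for dx in range(iw - sw + 1):
            score = sum(
                1
                for y in range(sh)
                for x in range(sw)
                if image[dy + y][dx + x] == sample[y][x]
            )
            if score > best[1]:
                best = ((dx, dy), score)
    return best

def ear_position(image, sample, ref):
    """ref: reference position (x1, y1) inside the sample. Returns
    (x1 + dx, y1 + dy), or None when the best match falls below an
    assumed threshold of 90% pixel agreement."""
    (dx, dy), score = best_offset(image, sample)
    if score < 0.9 * len(sample) * len(sample[0]):
        return None
    return (ref[0] + dx, ref[1] + dy)
```

A real implementation would likely use a normalized correlation over multi-tone images rather than raw pixel agreement; this sketch only shows the offset-plus-reference arithmetic described in the text.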
[0058]
This method makes it possible to accurately detect the position of the ear in contact with the
panel 20. Furthermore, by matching against the sample, this method makes it possible to determine
whether the object contacting the panel 20 is an ear, or whether it is the ear of a pre-registered
person. Furthermore, this method makes it possible to detect detailed information about the ear
contact, such as the orientation and tilt of the ear.
[0059]
When detecting the position of the ear by pattern matching, the sample need not include a
reference position. In the example shown in FIG. 7C, the image 73 is pattern matched with a sample
75 that does not include a reference position. In the example of FIG. 7C, the two match best when
the sample 75 is shifted by x3 in the X-axis direction and y3 in the Y-axis direction with
reference to the upper left of the image 73. In this case, the position of the ear is calculated,
for example, as (x3, y3).
[0060]
Since the sample does not include a reference position, this scheme facilitates sample
preparation. For example, when only the movement amount and movement direction of the ear position
are needed and the position of a specific part of the ear need not be specified, this method can
obtain the necessary information without setting a reference position on the sample.
[0061]
The method of detecting the position of the ear in contact with the panel 20 using the touch
screen 21 is not limited to the method described above, and other methods may be adopted.
[0062]
The mobile phone 1A may be configured to guide the inclination of the ear contacting the panel 20
so that it approaches a suitable inclination registered in advance, using a method based on
pattern matching.
[0063]
The setting data 9Z stores information indicating the standard position determined based on the
position of the ear on the panel 20 detected by any of the above methods or other methods. The
mobile phone 1A adjusts the position at which the information for guiding the ear to the standard
position is displayed according to which part of the ear the detected position is referenced to.
For example, assume that the position of the ear is detected by pattern matching using a sample in
which the position corresponding to the tragus serves as the reference position. In this case, the
frame 61 shown in FIG. 6A is displayed so as to surround the area contacted by the ear of the user
or of a typical person when the tragus is located at the standard position, the symbol 63 shown in
FIG. 6B is displayed at the top of that area, and the symbol 65 shown in FIG. 6C is displayed at
the standard position.
[0064]
Subsequently, in order to guide the position of the ear to the standard position, control
performed after the ear touches the panel 20 will be described. When touch of the ear on the
panel 20 is detected by the touch screen 21, the mobile phone 1A detects the position where the
ear is in contact with the panel 20 by any of the above-described methods or other methods.
Then, the mobile phone 1A performs control to bring the position of the ear closer to the
standard position.
[0065]
When the ear is in contact with the panel 20, the display 2 is located on the side of the face. In
this case, it is not easy for the user to view the information displayed on the display 2. Therefore,
the cellular phone 1A guides the position of the ear to approach the standard position by
controlling the electric signal applied to the piezoelectric element.
[0066]
For example, it is assumed that the standard position is the position 81 on the panel 20, and that
the position of the ear detected using the touch screen 21 is the position 82. In this case, as
shown in FIG. 9A, the mobile phone 1A controls the electric signal applied to the piezoelectric
element so that the volume becomes lower as the distance between the position 81 and the position
82 becomes longer. That is, the mobile phone 1A changes the control of the piezoelectric element 7
so that the volume decreases as the distance between the position 81 and the position 82
increases.
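A minimal sketch of the FIG. 9A behavior, in which the drive level falls off with the distance between the detected position and the standard position; the linear ramp and the `max_dist` parameter are illustrative assumptions.

```python
# Illustrative sketch of the FIG. 9A control: the output level falls
# linearly as the ear position moves away from the standard position.

import math

def volume_factor(ear_pos, standard_pos, max_dist=40.0):
    """Returns a gain in [0, 1]: 1.0 at the standard position,
    0.0 at or beyond max_dist (units are arbitrary grid units)."""
    d = math.dist(ear_pos, standard_pos)
    return max(0.0, 1.0 - d / max_dist)
```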
[0067]
According to such control, by moving the mobile phone 1A while keeping the ear in contact with the
panel 20, the user can easily determine whether the position of the ear is approaching or moving
away from the standard position, and can thus bring the position of the ear close to the standard
position. Furthermore, according to this control, since the sound approaches its original volume
as the position of the ear approaches the standard position, the volume does not change abruptly
in the process, providing a natural and intuitive feeling of operation.
[0068]
Although FIG. 9A shows an example in which the control of the piezoelectric element 7 is changed
according to the distance between the position 81 and the position 82, the mobile phone 1A may
change the control of the piezoelectric element 7 with the direction of one position as viewed
from the other added as a further condition. Although FIG. 9A shows an example of changing the
volume of the output sound according to the distance between the position 81 and the position 82,
the mobile phone 1A may instead change the quality of the output sound according to that distance.
For example, as shown in FIG. 9B, the mobile phone 1A may change the level of the treble portion
of the output sound according to the distance between the position 81 and the position 82 in the
X-axis direction, and change the level of the bass portion of the output sound according to the
distance in the Y-axis direction. According to such control, the user can easily grasp in which
direction the mobile phone 1A should be moved to bring the position of the ear closer to the
standard position.
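The axis-wise variant of FIG. 9B can be sketched the same way, mapping the X-axis deviation to the treble level and the Y-axis deviation to the bass level; the mapping and its parameters are illustrative assumptions.

```python
# Illustrative sketch of the FIG. 9B variant: treble follows the
# X-axis deviation, bass follows the Y-axis deviation.

def tone_levels(ear_pos, standard_pos, max_dist=40.0):
    """Returns (treble_gain, bass_gain), each in [0, 1]."""
    dx = abs(ear_pos[0] - standard_pos[0])
    dy = abs(ear_pos[1] - standard_pos[1])
    return (max(0.0, 1.0 - dx / max_dist), max(0.0, 1.0 - dy / max_dist))
```

Because each axis drives a different part of the spectrum, the user can infer the movement direction from which part of the sound is attenuated.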
[0069]
The procedure of the control for guiding the position of the ear in contact with the panel 20 to
the standard position so that the user can hear the sound will be described with reference to
FIGS. 10 and 11. FIG. 10 is a
flowchart showing the procedure of control for guiding the position of the ear to the standard
position. FIG. 11 is a flowchart of the control procedure at the time of call reception. The
processing procedure shown in FIGS. 10 and 11 is realized by the controller 10 executing the
control program 9A.
[0070]
When the panel 20 is vibrated and sound output is started, or when preparation to vibrate the
panel 20 and output sound is completed (step S101), the controller 10 determines whether there is
an object in contact with the panel 20 based on the detection result of the touch screen 21 (step
S102). If there is no object in contact with the panel 20 (step S102, No), the controller 10
causes the display 2 to display a guide indicating the position where the ear should make contact,
as shown in FIGS. 6A to 6C (step S103). The display of the guide is continued until an object in
contact with the panel 20 is detected.
[0071]
When there is an object in contact with the panel 20 (Yes in step S102), the controller 10
acquires information indicating a standard position from the setting data 9Z stored in the storage
9 (step S104).
[0072]
When there is an object in contact with the panel 20, the controller 10 may turn off the display 2
to reduce power consumption. If the control for turning off the display 2 when the proximity
sensor 5 detects the approach of a face is implemented in another processing procedure, the
control for turning off the display 2 need not be executed in this procedure. Generally, in a
mobile phone such as a smartphone, in addition to the control for turning off the display 2 when
the proximity sensor 5 detects the approach of a face to the panel 20, control for disabling the
touch screen 21 to prevent erroneous operation is also performed. In the present embodiment,
however, since the position of the ear in contact with the panel 20 is detected using the touch
screen 21, the touch screen 21 is left active even when the proximity sensor 5 detects the
approach of a face to the panel 20.
[0073]
Subsequently, the controller 10 calculates the position of the object in contact with the panel 20
based on the detection result of the touch screen 21 (step S105). Then, the controller 10
calculates the deviation between the calculated position and the standard position (step S106).
The calculated deviation may include only a distance value, or may include both a distance value
and a direction value.
[0074]
Subsequently, the controller 10 changes the control of the guiding means in accordance with the
calculated deviation (step S107). In the present embodiment, the guiding means is the
piezoelectric element 7, and the control of the guiding means is changed, for example, as shown in
FIG. 9A or 9B. Then, the controller 10 records the position of the object in contact with the
panel 20 (step S108), and determines whether to end the sound output of the panel 20 (step S109).
Cases in which the sound output ends include, for example, the case where an operation by the user
for ending the sound output is detected, and the case where a process that outputs sound, such as
a call, music playback, or video playback, is completed.
[0075]
When the output of the sound is not ended (step S109, No), the controller 10 re-executes step S105
and the subsequent steps. When the output of the sound is ended (step S109, Yes), the controller
10 calculates the standard position based on the recorded positions, and updates the standard
position stored in the setting data 9Z if necessary (step S110). The standard position stored in
the setting data 9Z needs updating, for example, when the ear has stayed at a position other than
the standard position for longer than a predetermined time, or when more than a predetermined
period has elapsed since the standard position was last updated.
[0076]
In order to reduce power consumption, the controller 10 may interrupt the loop from step S105 to
step S109 and execute step S110 when the deviation becomes smaller than the threshold. The
controller 10 may also interrupt the loop from step S105 to step S109 and execute step S110 when a
predetermined time has elapsed since the touch of the object on the panel 20 was detected. When
interrupting the loop, the controller 10 may disable the touch screen 21 to further reduce power
consumption. Even when the loop is interrupted, the output of sound through the panel 20 is
continued.
[0077]
The controller 10 may start processing relating to sound output in response to the deviation
between the position of the ear and the standard position becoming smaller than a threshold. For
example, when there is an incoming call, the controller 10 starts guiding the position of the ear,
and may automatically start the call reception process when the deviation between the position of
the ear and the standard position becomes smaller than the threshold. The processing procedure in
this case will be described below.
[0078]
As shown in FIG. 11, when a call arrives (step S201), the controller 10 starts outputting a ringing
tone to notify the user of the incoming call (step S202). At this stage, the output of the ringing
tone may be performed via the panel 20, may be performed via the speaker 11, or may be
performed via both. The controller 10 may use the vibration of the vibrator 18, the flashing of
the lamp, and the like in order to notify the user of the incoming call.
[0079]
Subsequently, the controller 10 determines whether there is an object in contact with the panel 20
based on the detection result of the touch screen 21 (step S203). When there is no object in
contact with the panel 20 (step S203, No), the controller 10 causes the display 2 to display a
guide indicating the position where the ear should make contact, as shown in FIGS. 6A to 6C (step
S204). The display of the guide is continued until an object in contact with the panel 20 is
detected.
[0080]
If there is an object in contact with the panel 20 (step S203, Yes), the controller 10 acquires
information indicating a standard position from the setting data stored in the storage 9 (step
S205). At this stage, the controller 10 outputs a ringing tone at least through the panel 20. When
the ringing tone is output through the speaker 11, the controller 10 may stop outputting the
ringing tone through the speaker 11 at this stage. The control regarding the turning off of the
display 2 and the disabling of the touch screen 21 when there is an object in contact with the
panel 20 is the same as in the case of FIG. 10.
[0081]
Subsequently, the controller 10 calculates the position of the object in contact with the panel 20
based on the detection result of the touch screen 21 (step S206). Then, the controller 10
calculates the deviation between the calculated position and the standard position (step S207).
The calculated deviation may include only a distance value, or may include both a distance value
and a direction value.
[0082]
Subsequently, the controller 10 changes the control of the guiding means in accordance with the
calculated deviation (step S208). The control of the guiding means is similar to the control
already described. Then, the controller 10 determines whether the calculated deviation is less
than the threshold (step S209). If the deviation is not less than the threshold (No at step S209),
step S206 and subsequent steps are re-executed.
[0083]
If the deviation is less than the threshold, that is, if the user can hear the sound sufficiently
well (step S209, Yes), the controller 10 executes the call reception process (step S210). The call
reception process is a process of starting a call in response to a signal indicating off-hook.
When the call is started, the other party's voice is transmitted to the user through the panel 20.
By automatically executing the call reception process at such a timing, the mobile phone 1A can
let the user start a call, without requiring any extra operation, in a state in which the user can
easily hear the other party's voice. The call is continued until an operation or the like for
disconnecting the call is performed on either side (step S211).
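The auto-answer behavior of steps S206 to S210 can be sketched as follows; all callables are hypothetical stand-ins for the controller's real operations, and the threshold value is an illustrative assumption.

```python
# Illustrative sketch of steps S206-S210: keep guiding until the
# deviation drops below the threshold, then run call reception.

import math

def guide_then_answer(detect_position, standard_pos, set_guidance,
                      accept_call, threshold=5.0, max_iters=1000):
    """Returns the number of iterations needed before accept_call()
    was executed (S210), or None if the threshold was never reached."""
    for i in range(1, max_iters + 1):
        pos = detect_position()                    # S206
        deviation = math.dist(pos, standard_pos)   # S207
        set_guidance(deviation)                    # S208
        if deviation < threshold:                  # S209
            accept_call()                          # S210
            return i
    return None
```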
[0084]
Second Embodiment In the above embodiment, an example in which the touch screen 21 is
disposed on substantially the entire surface of the panel 20 has been described, but the touch
screen 21 may be disposed so as not to overlap the panel 20. FIG. 12 is a front view of the
mobile phone 1B disposed so that the touch screen 21 does not overlap the panel 20. FIG. 13 is a
cross-sectional view schematically showing the b-b cross section of the mobile phone 1B.
[0085]
As shown in FIGS. 12 and 13, in the mobile phone 1B, the display 2 is disposed side by side with
the panel 20 so as to be flush with the panel 20 rather than inside it. The touch screen 21 is
disposed so as to cover substantially the entire front surface of the display 2. That is, the
touch screen 21 and the display 2 constitute a so-called touch panel (touch screen display).
[0086]
The piezoelectric element 7 is attached to a substantially central portion of the rear surface of
the panel 20 by a bonding member 30. When an electric signal is applied to the piezoelectric
element 7, the panel 20 vibrates according to the deformation (expansion, contraction, or bending)
of the piezoelectric element 7, and generates air conduction sound and vibration sound transmitted
via a part of the human body (e.g., the ear cartilage) in contact with the panel 20. By disposing
the piezoelectric element 7 at the center of the panel 20, the vibration of the piezoelectric
element 7 is transmitted uniformly to the entire panel 20, and the quality of the air conduction
sound and the vibration sound is improved.
[0087]
Although the touch screen 21 is not disposed on the front surface of the panel 20, the panel 20
is disposed in the vicinity of the display 2 on which the touch screen 21 is disposed.
[0088]
When the user of the mobile phone 1B having such a configuration brings the ear into contact with
the panel 20 to hear vibration sound, part of the ear touches the touch screen 21 because the
panel 20 is in the vicinity of the touch screen 21. Therefore, by dividing the detection area of
the touch screen 21 into a grid and converting the detection state of the ear contact in each of
the divided areas into the state of the corresponding pixel, an image 76 as shown in FIG. 14 is
obtained.
[0089]
When the image 76 is obtained, the mobile phone 1B obtains the relative position of the image 76
and the sample 75 at which the two are best matched by pattern matching. In the example of FIG.
14, the two match best when the sample 75 is shifted by x4 in the X-axis direction and by -y4 in
the Y-axis direction with reference to the upper left of the image 76. In this case, the position
of the ear is calculated as (x4, -y4). The mobile phone 1B can also detect the position of the ear
using the sample 74 including the reference position 74a.
[0090]
As described above, even when the touch screen 21 is disposed so as not to overlap the panel 20,
the mobile phone 1B can detect the position of the ear contacting the panel 20 using the touch
screen 21. Therefore, the mobile phone 1B can execute control for guiding the position of the ear
to the standard position, similarly to the mobile phone 1A.
[0091]
However, in the mobile phone 1B, when information for guiding the ear to be brought into contact
with the panel 20 to the standard position is displayed on the display 2, the position where the
information should be displayed may be outside the display area of the display 2. Therefore, as
shown in FIG. 15, the mobile phone 1B displays an overall view 67a of the mobile phone 1B and a
message 69a on the display 2, and arranges a symbol 68a on the overall view 67a.
[0092]
Third Embodiment In the above embodiments, examples were described in which at least a part of the
touch screen 21 is disposed so as to overlap the display 2. However, the touch screen 21 may be
disposed so as not to overlap the display 2. FIG. 16 is a front view of the mobile phone 1C, in
which the touch screen 21 is disposed so as not to overlap the display 2. FIG. 17 is a
cross-sectional view schematically showing the c-c cross section of the mobile phone 1C.
[0093]
As shown in FIGS. 16 and 17, in the mobile phone 1C, the display 2 is disposed side by side with
the panel 20 so as to be flush with the panel 20 rather than inside it.
[0094]
The piezoelectric element 7 is attached to a substantially central portion of the rear surface of
the panel 20 by a bonding member 30. A reinforcing member 31 is disposed between the panel 20 and
the piezoelectric element 7. That is, in the mobile phone 1C, the piezoelectric element 7 and the
reinforcing member 31 are bonded by the bonding member 30, and the reinforcing member 31 and the
panel 20 are also bonded by the bonding member 30.
11-04-2019
27
[0095]
The reinforcing member 31 is, for example, an elastic member such as rubber or silicone. The
reinforcing member 31 may be, for example, a metal plate made of aluminum or the like having a
certain degree of elasticity. The reinforcing member 31 may be, for example, a stainless steel
plate such as SUS304. The thickness of such a metal plate is suitably 0.2 mm to 0.8 mm, for
example, depending on the voltage value applied to the piezoelectric element 7 and the like. The
reinforcing member 31 may also be a plate made of resin. Examples of the resin forming such a
plate include polyamide-based resins. The polyamide resin is, for example, a crystalline
thermoplastic resin obtained from metaxylylenediamine and adipic acid; one example is Reny
(registered trademark), which is rich in strength and elasticity. The polyamide resin may itself
be used as the base polymer of a reinforced resin reinforced with glass fiber, metal fiber, carbon
fiber, or the like. The strength and elasticity of the reinforced resin are adjusted appropriately
according to the amount of glass fiber, metal fiber, carbon fiber, or the like added to the
polyamide resin. The reinforced resin is formed, for example, by impregnating a base material
formed by weaving glass fiber, metal fiber, carbon fiber, or the like with the resin and curing
it. The reinforced resin may also be formed by mixing finely cut fiber pieces into a liquid resin
and then curing it. The reinforced resin may also be a laminate of a base on which fibers are
woven and a resin layer.
[0096]
By arranging the reinforcing member 31 between the piezoelectric element 7 and the panel 20,
the following effects can be obtained. When an external force is applied to the panel 20, the
possibility that the external force is transmitted to the piezoelectric element 7 and the
piezoelectric element 7 is broken can be reduced. For example, when an external force is applied
to the panel 20 by the cellular phone 1C falling to the ground, the external force is first
transmitted to the reinforcing member 31. Since the reinforcing member 31 has a predetermined
elasticity, it is elastically deformed by an external force transmitted from the panel 20. Therefore,
at least a part of the external force applied to the panel 20 is absorbed by the reinforcing
member 31, and the external force transmitted to the piezoelectric element 7 is reduced. As a
result, breakage of the piezoelectric element 7 can be reduced. When the reinforcing member 31 is
disposed between the piezoelectric element 7 and the casing 40, the possibility that the casing
40, deformed by, for example, the mobile phone 1C falling to the ground, collides with the
piezoelectric element 7 can be reduced.
[0097]
Vibration due to expansion or contraction or bending of the piezoelectric element 7 is first
transmitted to the reinforcing member 31 and further transmitted to the panel 20. That is, the
piezoelectric element 7 first vibrates the reinforcing member 31 having a larger elastic
coefficient than the piezoelectric element 7 and further vibrates the panel 20. Therefore, as
compared with a structure in which the mobile phone 1C does not include the reinforcing member 31
and the piezoelectric element 7 is joined directly to the panel 20 by the bonding member 30, the
deformation of the piezoelectric element 7 is less likely to become excessive. Thereby, the amount
of deformation (degree of deformation) of the panel 20 can be adjusted. This structure is
particularly effective for a panel 20 that hardly inhibits the deformation of the piezoelectric
element 7.
[0098]
Furthermore, by arranging the reinforcing member 31 between the piezoelectric element 7 and the
panel 20, as shown in FIG. 18, the resonant frequency of the panel 20 is lowered, and the acoustic
characteristics in the low frequency band are improved. FIG. 18 is a view showing an example of
the change in frequency characteristics caused by the reinforcing member 31. FIG. 18 shows the
frequency characteristics in the case of using a sheet metal such as the above-mentioned SUS304 as
the reinforcing member 31, and the frequency characteristics in the case of using a reinforced
resin such as the above-mentioned Reny as the reinforcing member 31. The horizontal axis
represents frequency, and the vertical axis represents sound pressure. The resonance point in the
case of using the reinforced resin is about 2 kHz, and the resonance point in the case of using
the sheet metal is about 1 kHz. The dip is at about 4 kHz when the reinforced resin is used, and
at about 3 kHz when the sheet metal is used. That is, when the reinforced resin is used, the
resonance point of the panel 20 and the dip of the frequency characteristic are located in higher
frequency regions than when the sheet metal is used. Since the frequency band used for voice
communication of a mobile phone is 300 Hz to 3.4 kHz, the dip can be kept out of the frequency
band used by the mobile phone 1C when the reinforced resin is used as the reinforcing member 31.
Even when a sheet metal is used as the reinforcing member 31, the dip can be kept out of the
frequency band used by the mobile phone 1C by appropriately adjusting the type or composition of
the metal constituting the sheet metal or the thickness of the sheet metal. When sheet metal and
reinforced resin are compared, the reinforced resin has less impact on antenna performance than
the sheet metal. The reinforced resin is less likely to be plastically deformed than the sheet
metal, and thus has the advantage that its acoustic characteristics are less likely to change. The
reinforced resin also suppresses the temperature rise at the time of sound generation as compared
with the sheet metal. Instead of the reinforcing member 31, a plate-like weight may be attached to
the piezoelectric element 7 by the bonding member 30.
[0099]
When an electric signal is applied to the piezoelectric element 7, the panel 20 vibrates according
to the deformation (expansion, contraction, or bending) of the piezoelectric element 7, and
generates air conduction sound and vibration sound transmitted via a part of the human body (e.g.,
the ear cartilage) in contact with the panel 20. The touch screen 21 is disposed so as to cover
substantially the entire front surface of the panel 20.
[0100]
When the user of the mobile phone 1C having such a configuration brings the ear into contact with
the panel 20 to hear vibration sound, part of the ear contacts the touch screen 21 even though the
touch screen 21 is smaller than the ear. Therefore, by dividing the detection area of the touch
screen 21 into a grid and converting the detection state of the ear contact in each of the divided
areas into the state of the corresponding pixel, an image 77 as shown in FIG. 19 is obtained.
[0101]
When the image 77 is obtained, the mobile phone 1C obtains the relative position of the image 77
and the sample 75 at which the two are best matched by pattern matching. In the example of FIG.
19, the two match best when the sample 75 is shifted by x5 in the X-axis direction and by -y5 in
the Y-axis direction with reference to the upper left of the image 77. In this case, the position
of the ear is calculated as (x5, -y5). The mobile phone 1C can also detect the position of the ear
using the sample 74 including the reference position 74a.
[0102]
As described above, even when the touch screen 21 is disposed so as not to overlap the display 2,
the mobile phone 1C can detect the position of the ear contacting the panel 20 using the touch
screen 21. Therefore, the mobile phone 1C can execute control for guiding the position of the ear
to the standard position, similarly to the mobile phone 1A.
[0103]
However, in the mobile phone 1C, when information for guiding the ear that is about to touch the
panel 20 to the standard position is displayed on the display 2, the position where the
information should be displayed may be outside the display area of the display 2. Therefore, as
shown in FIG. 20, the mobile phone 1C displays an overall view 67b of the mobile phone 1C and a
message 69b on the display 2, and arranges a symbol 68b on the overall view 67b.
[0104]
Other Embodiments The embodiments disclosed in the present application can be modified, in ways
apparent to those skilled in the art, without departing from the spirit and scope of the
invention. Furthermore, the disclosed embodiments and their modifications can be combined as
appropriate. For example, the above embodiments may be modified as follows.
[0105]
For example, each program shown in FIG. 5 may be divided into a plurality of modules, or may be
combined with other programs.
[0106]
Although the above embodiments show examples in which the position of an object contacting the panel 20 is detected using the touch screen 21, the detection unit that detects the position of an object is not limited to the touch screen 21.
For example, the detection unit that detects the position of the object may be the camera 12. In
this case, the position of the object is detected based on the image acquired by the camera 12.
[0107]
11-04-2019
31
In the above embodiments, an example in which the movement of an object in contact with the panel 20 is detected using the touch screen 21 has been described, but the detection unit that detects the movement of the object is not limited to the touch screen 21. For example, the movement of the object may be detected based on images acquired by the camera 12, or may be detected by an acceleration sensor included in the posture detection unit 15.
[0108]
In the above embodiments, guidance is performed on the premise that the object in contact with the panel 20 is an ear, but the mobile phones 1A to 1C may determine whether the object in contact with the panel 20 is an ear and perform guidance only when it is. With this control, the position of the ear is guided when an ear is in contact with the panel 20, while processing according to a touch operation is performed when a finger is in contact with the panel 20; that is, the control can be switched according to the situation. Whether or not the object touching the panel 20 is an ear can be determined, for example, by raising the accuracy required of the pattern matching with the sample.
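The ear-versus-finger discrimination mentioned above could be reduced to a minimum-overlap test against the registered sample. The set-of-cells representation and the threshold value below are assumptions for illustration; the patent only says that the pattern-matching accuracy is increased.

```python
def is_ear(contact_cells, ear_sample, min_matched=20):
    """Treat the contact as an ear only if enough cells of the registered
    ear sample (a set of touched grid cells) appear in the detected
    contact patch; a fingertip yields far fewer matching cells."""
    return len(contact_cells & ear_sample) >= min_matched
```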
[0109]
The mobile phones 1A to 1C may be configured to support a plurality of users. In this case, a standard position is stored in the setting data 9Z for each user. The mobile phones 1A to 1C then determine which pre-registered user's ear the object in contact with the panel 20 is, and perform guidance to the standard position corresponding to the determined user. Which user's ear the object contacting the panel 20 is can be determined, for example, by preparing a sample for each user. Alternatively, it can be determined by having each user initially touch the panel with the ear at a different position.
[0110]
Although the above embodiments show examples in which the piezoelectric element 7 is used as the guiding means for guiding the ear to the standard position, any means capable of moving the ear to the standard position while the user is in contact with the panel 20 may be used. For example, the guiding means may be the vibrator 18 or the speaker 11. When the
vibrator 18 is used as the guiding means, guidance is realized by changing the strength or frequency of its vibration. When the speaker 11 is used as the guiding means, guidance is realized using sound output from the speaker 11. A plurality of guiding means may be used in combination to realize the guidance.
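When the vibrator 18 serves as the guiding means, one plausible mapping (an assumption, not specified in the text) is to scale the vibration strength with the distance between the detected ear position and the standard position, so the vibration fades as the ear approaches the target:

```python
import math

def vibration_strength(ear_pos, standard_pos, max_strength=100):
    """Map the ear-to-standard-position distance to a vibration strength
    in [0, max_strength]; saturates 50 pixels away (illustrative constant)."""
    distance = math.hypot(ear_pos[0] - standard_pos[0],
                          ear_pos[1] - standard_pos[1])
    return min(max_strength, max_strength * distance / 50)
```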
[0111]
Although the above embodiments show examples in which the detected position of the ear is recorded to update the standard position while control for guiding to the standard position is performed, these operations may be performed at different timings. For example, the mobile phones 1A to 1C may be configured to perform control for guiding to the standard position from when contact of an object is detected on the panel 20 until a predetermined condition is satisfied, and then to start recording the detected position of the ear. The predetermined condition is, for example, that the deviation between the position of the ear and the standard position has become smaller than a threshold, or that a predetermined time has elapsed. In this way, the detected position of the ear can be recorded for updating the standard position without being affected by the control for guidance.
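The two-phase behaviour described above, guiding first and recording after a predetermined condition holds, can be sketched as a small decision function. The threshold and timeout values are assumptions:

```python
import math

def process_sample(ear_pos, standard_pos, elapsed, recorded,
                   threshold=5.0, timeout=3.0):
    """Decide, for one detected ear position, whether to keep guiding or to
    record the position for a later update of the standard position.
    The predetermined condition is: deviation below threshold, or more
    than timeout seconds elapsed since contact was detected."""
    deviation = math.hypot(ear_pos[0] - standard_pos[0],
                           ear_pos[1] - standard_pos[1])
    if deviation < threshold or elapsed > timeout:
        recorded.append(ear_pos)  # feeds the standard-position update
        return "record"
    return "guide"
```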
[0112]
Although the above embodiments show processing in which a mobile phone performs guidance to the standard position, the notification is not limited to this. The mobile phone may notify the user of the specific position by displaying information on the specific position on the display 2. For example, the mobile phone may display on the display 2 information on the specific position (optimum position) at which the vibration sound is most easily transmitted to the user. For example, in the mobile phone 1A shown in FIG. 1, the region of the panel 20 about 1 cm below the region where the piezoelectric element 7 is attached (the region immediately above the piezoelectric element) is the optimum position. The vibration sound is most easily transmitted to the user when a part of the ear is placed at this optimum position. The optimum position may vary depending on various parameters such as the dimensions of the panel 20, the material of the panel 20, or the amplitude of vibration of the piezoelectric element 7.
[0113]
As described above, the region immediately above the piezoelectric element does not necessarily
coincide with the optimum position at which the vibration sound is most easily transmitted to the user. This is due to the relationship between the position at which the piezoelectric element is attached on the panel and the position at which the user's ear contacts the panel. In FIG. 1, the piezoelectric element 7 is disposed in the vicinity of, and separated by a predetermined distance from, one end of the panel 20. The degree of deformation of the panel 20 caused by the deformation of the piezoelectric element 7 decreases with distance from the area where the piezoelectric element 7 is attached; that is, the vibration of the panel 20 is attenuated farther from that area. On the other hand, the closer the position at which the user's ear contacts the panel is to the center of the panel, the more the ear canal is blocked by the panel, and the less ambient sound (noise) enters the ear canal. Considering both the degree of vibration damping and the magnitude of the ambient sound entering the ear canal, the ratio of the magnitude of the vibration sound to the noise (S/N ratio) is maximized when the area of the panel 20 about 1 cm below the area directly above the piezoelectric element is placed against the ear.
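The trade-off in this paragraph can be made concrete with a toy numeric model. All values below are assumed for illustration (the patent gives no measurements): vibration amplitude decays with distance from the region directly above the piezoelectric element, ambient noise entering the ear canal drops toward the panel centre, and the resulting S/N ratio peaks about 1 cm away.

```python
# Relative levels sampled every 1 cm, starting at the region directly
# above the piezoelectric element and moving toward the panel centre.
vibration = [1.00, 0.90, 0.78, 0.66, 0.55, 0.45]  # attenuates with distance
ambient   = [0.50, 0.35, 0.33, 0.30, 0.28, 0.25]  # drops toward the centre

def snr(i):
    """Ratio of vibration-sound level to ambient noise at i cm."""
    return vibration[i] / ambient[i]

best_cm = max(range(len(vibration)), key=snr)  # index with the highest S/N
```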
[0114]
As shown in FIG. 21, the mobile phone may display, as the information on the optimum position, an image imitating the shape of an ear on the display 2. The mobile phone displays the image 85 imitating the shape of the ear on the display 2 such that, for example, the tragus corresponds to the optimum position of the panel 20. The air conduction sound and the vibration sound may be generated when the image 85 matches the touch pattern detected by the touch screen.
[0115]
The specific position (optimum position) may also be notified to the user by providing a projection (convex portion) of a predetermined shape on the panel 20 or the housing 40. The projection is provided, for example, at a position along the helix of the ear, or at the position of the tragus, when the ear contacts the optimum position. In this case, the user can recognize the optimum position when a part of the ear touches the projection; that is, the projection functions as one of the notification units of the present application. Alternatively, a notch (concave portion) may be formed in the panel 20 or the housing 40.
[0116]
11-04-2019
34
In the above embodiments, an example is shown in which the display 2 is attached to the back of the panel 20 of the mobile phone 1A using the bonding member 30, but the mobile phone 1A may be configured with a space between the panel 20 and the display 2. Providing a space between the panel 20 and the display 2 allows the panel 20 to vibrate more easily and widens the range of the panel 20 over which the vibration sound can be easily heard.
[0117]
Although the above embodiments show examples in which the piezoelectric element 7 is attached to the panel 20, it may be attached elsewhere. For example, the piezoelectric element 7 may be attached to a battery lid. The battery lid is a member that is attached to the housing 40 and covers the battery. Since the battery lid is often attached to a surface different from the display 2 in a portable electronic device such as a mobile phone, this configuration allows the user to hear the sound by bringing a part of the body (for example, an ear) into contact with a surface different from the display 2. The piezoelectric element 7 may also be configured to vibrate a corner (for example, at least one of the four corners) of the housing 40. In this case, the piezoelectric element 7 may be attached to the inner surface of the corner of the housing 40, or an intermediate member may further be provided so that the vibration of the piezoelectric element 7 is transmitted to the corner of the housing 40 via the intermediate member. According to this configuration, the range of vibration can be made relatively narrow, so the air conduction sound generated by the vibration hardly leaks to the surroundings. Further, according to this configuration, the air conduction sound and the vibration sound are transmitted to the user while the user inserts the corner of the housing into the external ear canal, so ambient noise does not easily enter the user's external ear canal. Therefore, the quality of the sound transmitted to the user can be improved.
[0118]
In the above embodiment, the reinforcing member 31 is a plate-like member, but the shape of the
reinforcing member 31 is not limited to this. For example, the reinforcing member 31 may be
larger than the piezoelectric element 7 and may have a shape in which an end thereof is curved
toward the piezoelectric element 7 and covers the side of the piezoelectric element 7. Further,
the reinforcing member 31 may have a shape including, for example, a plate-like portion and an
extending portion which is extended from the plate-like portion and covers the side portion of
the piezoelectric element 7. In this case, the extension portion and the side portion of the
piezoelectric element 7 may be separated by a predetermined distance. This makes it difficult for
the extension portion to inhibit the deformation of the piezoelectric element.
[0119]
In the above embodiment, a mobile phone has been described as an example of an apparatus
according to the appended claims, but an apparatus according to the appended claims is not
limited to a mobile phone. The device according to the appended claims may be a portable
electronic device other than a mobile phone. Portable electronic devices include, but are not
limited to, tablets, portable personal computers, digital cameras, media players, electronic book
readers, navigators, and game consoles, for example.
[0120]
The characteristic embodiments have been described in order to disclose the technology of the appended claims fully and clearly. However, the appended claims are not limited to the above embodiments and should be construed to embody all variations and alternative configurations that those skilled in the art can create within the scope of the basic matters described in the present specification.
[0121]
1A to 1C mobile phone, 2 display, 3 button, 4 illuminance sensor, 5 proximity sensor, 6 communication unit, 7 piezoelectric element, 8 microphone, 9 storage, 9A control program, 9B call application, 9C music playback application, 9D video playback application, 9Z setting data, 10 controller, 11 speaker, 12 camera, 15 posture detection unit, 18 vibrator, 20 panel, 21 touch screen, 30 bonding member, 31 reinforcing member, 40 housing