Patent Translate
Powered by EPO and Google
Notice
This translation is machine-generated. It cannot be guaranteed that it is intelligible, accurate,
complete, reliable or fit for specific purposes. Critical decisions, such as commercially relevant or
financial decisions, should not be based on machine-translation output.
DESCRIPTION JP2018060297
Abstract: When a finger, a pen, or the like touches a conference terminal to draw an image, the sound of the contact is picked up. A conference terminal includes a touch panel display that receives drawing input from a user and displays an image, a plurality of microphones that collect surrounding sound, and a control unit that determines the position on the touch panel display at which the user is drawing and selects, from the plurality of microphones, a microphone for sound collection according to the drawing position. [Selected figure] Figure 2
Conference terminal, microphone selection method, and program
[0001]
The present invention relates to a conference terminal, a microphone selection method, and a
program.
[0002]
Video conference systems are known that hold a conference between remote locations via a network. In such a video conference system, digital image and audio data acquired at one remote location are transmitted to another remote location, where the image and audio are output by a display and a speaker and thereby shared.
[0003]
Further, in a video conference system, there is known a technology related to a conference
terminal having an interactive whiteboard (or electronic blackboard) realized by a touch panel
mounted display. Such a conference terminal converts an image drawn by the user on the touch
panel at one remote place into electrical image data and transmits the image data to the other
remote place. When receiving the image data, another remote conference terminal displays a
hand-drawn image on the display.
[0004]
However, in the above-described technology, there is a problem in that the microphones pick up the sound made when a finger, a pen, or the like touches the conference terminal to draw an image.
[0005]
The present invention has been made in view of the above, and aims to provide a conference terminal that can suppress the collection of the sound made by drawing an image on the conference terminal.
[0006]
In order to solve the problems described above and achieve the object, a conference terminal according to the present invention includes a touch panel display that receives drawing input from a user and displays an image, a plurality of microphones that collect surrounding sound, and a control unit configured to select, from the plurality of microphones, a microphone for sound collection according to the position on the touch panel display at which the user is drawing.
[0007]
According to the present invention, it is possible to suppress the collection of sound at the time
of drawing.
[0008]
FIG. 1 is an overall configuration diagram of a video conference system according to the first embodiment.
FIG. 2 is a block diagram for explaining the hardware configuration of an interactive whiteboard having a conference terminal function.
FIG. 3 is a front view of the input / output board unit.
FIG. 4 is a functional block diagram for explaining the functions of the information processing unit of the conference terminal in the microphone selection process.
FIG. 5 is a flowchart of the microphone selection process performed by the control unit.
FIG. 6 is a diagram showing a state in which the user is drawing in the upper left drawing area of the touch panel display.
FIG. 7 is a front view of the input / output board unit for explaining the first modification.
FIG. 8 is a front view of the input / output board unit for explaining the second modification.
FIG. 9 is a front view of the input / output board unit for explaining the third modification.
FIG. 10 is a front view of the input / output board unit for explaining the fourth modification.
[0009]
In the following exemplary embodiments, the same components are denoted by the same reference numerals, and overlapping descriptions are omitted as appropriate.
[0010]
First Embodiment FIG. 1 is an overall configuration diagram of a video conference system 10
according to a first embodiment.
As shown in FIG. 1, the video conference system 10 includes a conference reservation server 12, a plurality of conference terminals 14a, 14b, 14c, and 14d, and a conference server 16. When it is not necessary to distinguish the conference terminals 14a, 14b, 14c, and 14d from one another, they are referred to as the conference terminals 14. The conference reservation server 12, the conference server 16, and the plurality of conference terminals 14 are connected via a communication network 18, such as the Internet or a LAN (Local Area Network), capable of transmitting and receiving data such as image data and audio data.
[0011]
An example of the conference reservation server 12 is a computer. The conference reservation server 12 may be connected to another personal computer. The conference reservation server 12 acquires, from the conference terminal 14, conference information input by a user such as a conference organizer. The conference information includes, for example, the conference date and time, the place, the participants, their roles, and the participating conference terminals. The conference reservation server 12 may also acquire conference information from a personal computer other than the conference terminal 14.
[0012]
The plurality of conference terminals 14 are installed at positions away from one another (for example, at remote locations). The conference terminal 14 is, for example, a computer having an interactive whiteboard function. The interactive whiteboard function is realized by a touch-panel-equipped display. Specifically, the interactive whiteboard function transmits and receives, to and from other conference terminals 14, image data of images to be displayed according to instructions from the computer and drawing data obtained by converting images drawn by the user on the screen. When activated, the conference terminal 14 inquires of the conference reservation server 12 whether there is conference information for a video conference in which that conference terminal 14 is registered as a participating terminal. When there is a video conference in which the conference terminal 14 participates, the conference terminal 14 controls the video conference based on a preset video conference program or the like. For example, the conference terminal 14 transmits, to the conference server 16, image data and audio data acquired from the user during a video conference. The conference terminal 14 also receives, from the conference server 16, image data and audio data input by users at other conference terminals 14 and outputs the images and audio.
[0013]
An example of the conference server 16 is a computer. The conference server 16 monitors whether each conference terminal 14 is connected to it via the communication network 18. The conference server 16 executes call control of the conference terminals 14 at the start of a conference based on the conference information of the conference reservation server 12. The conference server 16 controls the video conference, including the transmission and reception of information with the conference terminals 14.
[0014]
FIG. 2 is a block diagram for explaining the hardware configuration of an interactive whiteboard
having a conference terminal function. As shown in FIG. 2, the conference terminal 14 includes
an input / output board unit 20, a communication I / F unit 22, an operation unit 24, and an
information processing unit 26.
[0015]
The input / output board unit 20 has a camera 30, a touch panel display 32, a plurality of microphones 34a, 34b, 34c, and 34d, and a speaker 36. In the following description, the microphones 34a, 34b, 34c, and 34d are referred to as the microphones 34 when it is not necessary to distinguish them. The camera 30, the touch panel display 32, the microphones 34, and the speaker 36 are connected to the information processing unit 26 so as to transmit and receive data.
[0016]
An example of the camera 30 is an imaging device having a CMOS sensor, a CCD sensor, or the like. The camera 30 captures images of surrounding subjects and generates image data such as moving images. It is preferable that the camera 30 can capture the surroundings over a wide angle. For example, the camera 30 captures images of the participants in the conference room in which the conference terminal 14 participating in the video conference is installed, and generates image data. The camera 30 transmits the captured image data to the information processing unit 26.
[0017]
The touch panel display 32 has a display function of displaying images on a liquid crystal display device or the like, and a touch panel function of receiving drawing input from a user (for example, a participant in a video conference). For example, the touch panel display 32 acquires from the information processing unit 26 the image data transmitted from another conference terminal 14 participating in a video conference, and displays the image. When the touch panel display 32 receives an input by the user touching it with a finger or a pen, it transmits the coordinate data of the touched position to the information processing unit 26. The touch panel display 32 may have a hovering detection function that can receive an input without the user directly touching it. An example of a touch panel display 32 with a hovering detection function is a display-integrated capacitive touch panel.
[0018]
The microphones 34 collect surrounding sounds, convert the sounds into electrical audio data, and transmit the data to the information processing unit 26. For example, the microphones 34 collect sounds such as the speech of video conference participants, convert them into audio data, and transmit the data.
[0019]
The speaker 36 converts the voice data received from the information processing unit 26 into
voice and outputs the voice to the outside. For example, the speaker 36 acquires, from the
information processing unit 26, audio data transmitted by another conference terminal 14
participating in the video conference, and outputs the audio.
[0020]
The communication I / F unit 22 connects to the information processing unit 26 and the
communication network 18 so as to transmit and receive data. Thereby, the communication I / F
unit 22 connects the information processing unit 26 to the conference reservation server 12, the
conference server 16, and other conference terminals 14 so as to transmit and receive data such
as image data and audio data. The communication I / F unit 22 is, for example, an interface of a
wired LAN connectable to Ethernet (registered trademark) corresponding to at least one of
10Base-T, 100Base-TX, and 1000Base-T. Also, the communication I / F unit 22 may be a wireless
LAN interface compatible with 802.11a / b / g / n.
[0021]
The operation unit 24 receives an input from the user. The operation unit 24 includes, for
example, at least one of a keyboard, operation buttons, and a mouse. The operation unit 24 is
connected to the information processing unit 26 so as to transmit and receive data. The
operation unit 24 transmits data related to the input received from the user to the information
processing unit 26.
[0022]
The information processing unit 26 is in charge of overall control of the conference terminal 14.
An example of the information processing unit 26 is a computer. The information processing unit
26 may be provided integrally with the input / output board unit 20, or may be provided as a
separate component from the input / output board unit 20. The information processing unit 26
includes a control unit 40, a storage device 42, and a memory 44.
[0023]
The control unit 40 is, for example, an arithmetic processing unit such as a processor including a CPU (Central Processing Unit). The control unit 40 is connected to the camera 30, the touch panel display 32, the microphones 34, the speaker 36, the communication I / F unit 22, and the operation unit 24 so as to transmit and receive data. The control unit 40 executes device control. Specifically, the control unit 40 controls the camera 30, the touch panel display 32, the microphones 34, the speaker 36, and the operation unit 24. For example, the control unit 40 acquires image data of the users in the conference room from the camera 30. The control unit 40 transmits audio data received from another conference terminal 14 or the like to the speaker 36 to output the audio. The control unit 40 transmits image data and drawing data received from another conference terminal 14 or the like to the touch panel display 32 to display the images. The control unit 40 acquires audio data, such as speech uttered by the user, from the microphones 34. The control unit 40 generates drawing data based on the coordinate data of the image drawn by the user, acquired from the touch panel display 32. The control unit 40 may execute control of the video conference using a CODEC (Coder / Decoder) function. The CODEC function of the control unit 40 conforms to, for example, H.264/AVC, H.264/SVC, or H.265. For example, the control unit 40 may encode image data, audio data, and drawing data during a video conference and then transmit the data to the conference server 16 via the communication I / F unit 22. The control unit 40 may also decode image data, audio data, drawing data, and the like of another conference terminal 14 acquired from the conference server 16 via the communication I / F unit 22, and then send the data to the touch panel display 32 and the speaker 36.
[0024]
The storage device 42 stores the programs for the device control and video conference control executed by the control unit 40, including the program for the microphone selection process, and data such as the parameters required to execute the programs. The storage device 42 is, for example, a non-volatile storage medium such as a flash memory, a hard disk drive (HDD), or a solid state drive (SSD).
[0025]
The memory 44 temporarily stores programs expanded for execution by the control unit 40 and working data during program execution. The memory 44 is, for example, a volatile memory such as DDR SDRAM (Double-Data-Rate SDRAM).
[0026]
FIG. 3 is a front view of the input / output board unit 20. As shown in FIG. 3, the touch panel display 32 is disposed substantially at the center of the input / output board unit 20. When the user moves a pen 46 or the like across the touch panel display 32 while touching it, the touch panel display 32 displays an image DR along the trajectory of the movement. The touch panel display 32 has four drawing areas AR1, AR2, AR3, and AR4, divided 2 × 2 in the horizontal and vertical directions. When there is no need to distinguish the drawing areas AR1, AR2, AR3, and AR4, they are referred to as the drawing areas AR. Although the drawing areas AR are delimited by dotted lines in the drawings for the sake of explanation, the four drawing areas AR need not be physically divided, and preferably the division is not visible to the user. The four drawing areas AR have, for example, the same area. The drawing area AR1 is disposed at the upper left of the touch panel display 32, the drawing area AR2 at the upper right, the drawing area AR3 at the lower right, and the drawing area AR4 at the lower left.
[0027]
The camera 30 is disposed above the touch panel display 32 and at the center of the upper
portion of the input / output board unit 20.
[0028]
The speaker 36 is disposed below the touch panel display 32, at the lower portion of the input / output board unit 20.
[0029]
As shown in FIG. 3, the input / output board unit 20 of the present embodiment has four microphones 34a, 34b, 34c, and 34d.
The number of microphones 34 is the same as the number of drawing areas AR.
In other words, the number and arrangement of the drawing areas AR are preferably set in accordance with the number of microphones 34. The four microphones 34 are arranged at mutually separated positions. Specifically, the microphone 34a is disposed at the upper left of the input / output board unit 20, the microphone 34b at the upper right, the microphone 34c at the lower right, and the microphone 34d at the lower left. That is, the four microphones 34a, 34b, 34c, and 34d are associated with the drawing areas AR1, AR2, AR3, and AR4, respectively: the microphone 34a is disposed near the drawing area AR1, the microphone 34b near the drawing area AR2, the microphone 34c near the drawing area AR3, and the microphone 34d near the drawing area AR4.
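As a concrete illustration of this 2 × 2 division and of the diagonal association between drawing areas and microphones, the following minimal Python sketch maps a touch position to its drawing area and to the corresponding sound-collection microphone. The panel resolution, the coordinate origin, and all names are assumptions for illustration, not part of the embodiment.

    # Illustrative sketch only: a 2 x 2 split of the panel into AR1..AR4 and
    # the fixed association of each area with its diagonally opposite
    # microphone. Resolution and names are assumed, not from the patent.
    WIDTH, HEIGHT = 1920, 1080  # assumed panel size, origin at the upper left

    def drawing_area(x: float, y: float) -> str:
        """Return the drawing area AR1..AR4 containing the touch point."""
        top = y < HEIGHT / 2
        left = x < WIDTH / 2
        if top:
            return "AR1" if left else "AR2"   # upper left / upper right
        return "AR4" if left else "AR3"       # lower left / lower right

    # Diagonally opposite microphone for each area
    # (cf. steps S112 to S122 of the flowchart described later).
    FARTHEST_MIC = {"AR1": "34c", "AR2": "34d", "AR3": "34a", "AR4": "34b"}

    area = drawing_area(100, 100)             # a touch near the upper left
    print(area, FARTHEST_MIC[area])           # -> AR1 34c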
[0030]
Next, the operation of the video conference system 10 described above will be described. Here, as an example, a video conference in which the conference terminals 14a, 14b, and 14c participate will be described. When the time of the video conference registered in the conference information of the conference reservation server 12 arrives, the conference server 16 calls the conference terminals 14 registered in the conference information as participating terminals. The conference terminals 14 that respond to the call then start transmitting and receiving data.
[0031]
For example, the conference terminal 14a sends to the conference server 16 image data of users (for example, conference participants) captured by the camera 30, audio data such as the users' speech acquired by the microphones 34, and drawing data of images drawn by the user on the touch panel display 32. In this case, the conference server 16 transmits the received data to the participating conference terminals 14b and 14c, but does not transmit the data to the conference terminal 14d, which is not participating in the video conference. The conference terminals 14b and 14c participating in the video conference thereby display the images of the received image data and drawing data on their touch panel displays 32, and output the sound of the received audio data from their speakers 36. Similarly, when the conference server 16 receives data from the conference terminal 14b (or the conference terminal 14c), it transmits the data to the conference terminals 14a and 14c (or the conference terminals 14a and 14b) but not to the conference terminal 14d. In this way, the video conference system 10 realizes a video conference between multiple sites by the conference terminals 14 registered in advance in the conference information among the plurality of conference terminals 14.
[0032]
Here, in the present embodiment, when a predetermined condition is satisfied, the conference terminal 14 transmits to the conference server 16 only the audio data generated by the microphone 34 selected by the microphone selection process described later. On the other hand, when the predetermined condition is not satisfied, the conference terminal 14 transmits to the conference server 16 the audio data acquired by the microphone 34 with the highest sound collection level. An example of the predetermined condition is that the user is drawing an image or the like on the touch panel display 32, or that a predetermined determination time has not yet elapsed since the end of drawing.
[0033]
FIG. 4 is a functional block diagram for explaining the functions of the information processing unit 26 of the conference terminal 14 in the microphone selection process. As shown in FIG. 4, the information processing unit 26 includes a determination unit 50, a selection unit 52, and a storage unit 54. The determination unit 50 and the selection unit 52 are functions realized by the control unit 40 reading the program for the microphone selection process stored in the storage unit 54 and expanding it in the memory 44. Note that part or all of the determination unit 50 and the selection unit 52 may be configured by hardware such as a circuit including an application specific integrated circuit (ASIC). The storage unit 54 is a function realized by the storage device 42 and the memory 44.
[0034]
The determination unit 50 performs various determinations in the microphone selection process performed in the video conference control process. For example, the determination unit 50 determines whether the video conference has ended, that is, whether the video conference is continuing. When the determination unit 50 determines that the video conference is continuing, it outputs a determination result to that effect to the selection unit 52.
[0035]
The determination unit 50 determines whether the user is drawing on the screen of the touch panel display 32. For example, when it determines from received coordinate data that drawing is being performed on the touch panel display 32, the determination unit 50 determines the position on the touch panel display 32 at which the user is drawing, and determines which drawing area AR contains the drawing position. The determination unit 50 outputs information on the drawing area AR containing the user's drawing position to the selection unit 52.
[0036]
The determination unit 50 determines whether the determination time has elapsed since the end of drawing on the touch panel display 32. The determination time is predetermined and stored in the storage unit 54. Note that once the user has drawn, the determination unit 50 does not execute the above-described determination of whether the video conference has ended until it determines that at least the determination time has elapsed since the end of drawing.
[0037]
When the selection unit 52 acquires from the determination unit 50 the determination result that the video conference is continuing, it selects the microphone 34 with the highest sound collection level as the microphone 34 for sound collection.
[0038]
The selection unit 52 selects a microphone 34 for sound collection from the plurality of microphones 34.
For example, when the user is drawing on the touch panel display 32, the selection unit 52 selects from the plurality of microphones 34 the microphone 34 set in advance according to the drawing position on the touch panel display 32. Specifically, when it acquires from the determination unit 50 the information of the drawing area AR containing the position at which the user is drawing, the selection unit 52 selects the microphone 34 for sound collection based on that drawing area AR. For example, the selection unit 52 selects the microphone 34 farthest from the drawing area AR, that is, the microphone 34 farthest from the drawing position, as the microphone 34 for sound collection.
[0039]
Here, the selection unit 52 does not acquire the determination result that the video conference is continuing until at least the determination time has elapsed since the user finished drawing. Therefore, once the user has drawn, the selection unit 52 keeps the microphone 34 selected based on the drawing position as the microphone 34 for sound collection, without selecting the microphone 34 with the highest sound collection level, until at least the determination time has elapsed since the end of drawing on the touch panel display 32. On the other hand, once the determination time has elapsed since the end of drawing on the touch panel display 32, the selection unit 52 acquires from the determination unit 50 the determination result that the video conference is continuing, and again selects the microphone 34 with the highest sound collection level as the microphone 34 for sound collection.
[0040]
The storage unit 54 stores data such as programs and the parameters required to execute them. For example, the storage unit 54 stores the program for the microphone selection process. The storage unit 54 also stores the coordinate data of the drawing areas AR of the touch panel display 32, the position of each microphone 34, the determination time, and the like.
[0041]
The program for the microphone selection process executed by the control unit 40 of the conference terminal 14 of the present embodiment has a module configuration including the above-described units (the determination unit 50 and the selection unit 52). In terms of actual hardware, when the control unit 40 reads out and executes the program for the microphone selection process from the storage device 42, the above-described units are loaded onto the main storage device. Thereby, the determination unit 50 and the selection unit 52 are generated on the main storage device, and these functions are realized by the computer.
[0042]
For example, the program for the microphone selection process executed by the control unit 40 of the conference terminal 14 of the present embodiment is provided by being incorporated in advance in the storage device 42 or the like. Alternatively, the program for the microphone selection process executed by the control unit 40 of the conference terminal 14 of the present embodiment may be provided as a file in an installable or executable format, recorded on a computer-readable recording medium such as a CD-ROM, a flexible disk (FD), a CD-R, or a DVD (Digital Versatile Disk).
[0043]
Furthermore, the program for the microphone selection process executed by the control unit 40 of the conference terminal 14 of the present embodiment may be stored on a computer connected to a network such as the Internet and provided by being downloaded via the network. The program for the microphone selection process executed by the control unit 40 of the conference terminal 14 of the present embodiment may also be provided or distributed via a network such as the Internet.
[0044]
FIG. 5 is a flowchart of the microphone selection process performed by the control unit 40. The
control unit 40 executes a microphone selection process, which is an example of a microphone
selection method, as a part of the video conference control process by reading a program for the
microphone selection process.
[0045]
As shown in FIG. 5, when the video conference starts, the determination unit 50 of the control unit 40 determines in the microphone selection process whether the video conference has ended (S102). For example, the determination unit 50 determines the end of the video conference depending on whether the user has operated the operation unit 24 to input an instruction to end the video conference. If it determines that the video conference has not ended, that is, that the video conference is continuing (S102: No), the determination unit 50 outputs the determination result to the selection unit 52.
[0046]
When the selection unit 52 acquires from the determination unit 50 the determination result that the video conference is continuing, it selects the microphone 34 with the highest sound collection level as the microphone 34 for sound collection (S104). The control unit 40 then transmits to the conference server 16 the audio data of the sound collected by the microphone 34 with the highest sound collection level.
[0047]
The determination unit 50 determines whether a user (for example, a conference participant) is drawing on the touch panel display 32 (S108). For example, the determination unit 50 determines whether drawing is being performed depending on whether coordinate data of a position drawn by the user is received from the touch panel display 32. If the determination unit 50 determines that the user is not drawing on the touch panel display 32 (S108: No), it repeats the process from step S102.
[0048]
On the other hand, when the determination unit 50 determines that the user is drawing on the touch panel display 32 (S108: Yes), it determines which drawing area AR contains the drawing position, based on the coordinate data acquired from the touch panel display 32 and the coordinate data of the drawing areas stored in the storage unit 54. FIG. 6 is a diagram showing a state in which the user is drawing in the drawing area AR1 at the upper left of the touch panel display 32.
[0049]
The determination unit 50 determines whether the drawing position is in the upper left drawing area AR1 (S110). When the determination unit 50 determines that the drawing position is in the drawing area AR1 (S110: Yes), it outputs the information of the drawing area AR1 to the selection unit 52. When it acquires the information of the drawing area AR1 from the determination unit 50, the selection unit 52 selects the lower right microphone 34c, which is least likely to pick up the sound of drawing in the drawing area AR1, as the microphone 34 for sound collection (S112).
[0050]
If the determination unit 50 determines that the drawing position is not in the upper left drawing area AR1 (S110: No), it determines whether the drawing position is in the upper right drawing area AR2 (S114). If the determination unit 50 determines that the drawing position is in the drawing area AR2 (S114: Yes), it outputs the information of the drawing area AR2 to the selection unit 52. When it acquires the information of the drawing area AR2 from the determination unit 50, the selection unit 52 selects the lower left microphone 34d, which is least likely to pick up the sound of drawing in the drawing area AR2, as the microphone 34 for sound collection (S116).
[0051]
If the determination unit 50 determines that the drawing position is not in the upper right drawing area AR2 (S114: No), it determines whether the drawing position is in the lower right drawing area AR3 (S118). When it determines that the drawing position is in the drawing area AR3 (S118: Yes), the determination unit 50 outputs the information of the drawing area AR3 to the selection unit 52. When it acquires the information of the drawing area AR3 from the determination unit 50, the selection unit 52 selects the upper left microphone 34a, which is least likely to pick up the sound of drawing in the drawing area AR3, as the microphone 34 for sound collection (S120).
[0052]
If the determination unit 50 determines that the drawing position is not in the lower right drawing area AR3 (S118: No), the drawing position is in the remaining lower left drawing area AR4, so the determination unit 50 outputs the information of the drawing area AR4 to the selection unit 52. When it acquires the information of the drawing area AR4 from the determination unit 50, the selection unit 52 selects the upper right microphone 34b, which is least likely to pick up the sound of drawing in the drawing area AR4, as the microphone 34 for sound collection (S122).
[0053]
When the selection unit 52 has selected one of the microphones 34 in step S112, S116, S120, or S122, the determination unit 50 determines whether the determination time stored in advance in the storage unit 54 has elapsed since the end of drawing (S124). The determination unit 50 may determine that drawing has ended when coordinate data is no longer received from the touch panel display 32. If the determination unit 50 determines that the determination time has not elapsed since the end of drawing (S124: No), it repeats step S124 and stands by. As a result, sound collection continues with the microphone 34 selected by the selection unit 52 based on the drawing area AR containing the drawing position.
[0054]
If the determination unit 50 determines that the determination time has elapsed since the end of drawing (S124: Yes), it repeats the process from step S102. Thereby, the selection unit 52 again selects the microphone 34 with the highest sound collection level in step S104. Thereafter, when the determination unit 50 determines that the video conference has ended (S102: Yes), the microphone selection process ends together with the video conference control process.
[0055]
As described above, in the conference terminal 14 of the video conference system 10, the selection unit 52 of the control unit 40 selects the microphone 34 farthest from the drawing area AR in which the user is drawing, so the collection of the sounds generated by drawing can be suppressed. As a result, the conference terminal 14 can suppress the transmission of drawing sounds to the other conference terminals 14, and can thereby suppress the output of noise at the other conference terminals 14.
[0056]
In the conference terminal 14, sound collection continues with the microphone 34 selected by the selection unit 52 based on the drawing area AR being drawn in, until the determination unit 50 of the control unit 40 determines that at least the determination time has elapsed since the end of drawing. As a result, even when the user draws across drawing areas AR, or temporarily stops drawing to speak, the conference terminal 14 can reduce frequent and inappropriate switching of the microphone 34 for sound collection, and can reduce the discomfort caused by such switching.
[0057]
Modifications in which parts of the above-described embodiment are changed will now be described. In the following modifications, the touch panel display 32 is configured to receive not only the single-touch drawing described above but also multi-touch drawing at a plurality of locations simultaneously.
[0058]
First Modification FIG. 7 is a front view of the input / output board unit 20 for explaining a first modification. In the situation shown in FIG. 7, the determination unit 50 determines that the user is drawing at three locations on the touch panel display 32 simultaneously, and outputs the information of the drawing areas AR1, AR3, and AR4 containing the drawing positions to the selection unit 52. When it acquires the information of the drawing areas AR1, AR3, and AR4, the selection unit 52 selects the microphone 34b, which is farthest from all three drawing areas being drawn in. For example, the selection unit 52 selects the microphone 34b farthest from the center of gravity of the three drawing areas AR1, AR3, and AR4. In other words, the selection unit 52 selects the microphone 34b closest to the drawing area AR2, in which no drawing is taking place.
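One way to realize "farthest from all of the drawing areas" is the center-of-gravity rule mentioned above. The following sketch, under the same assumed coordinates as the earlier snippets, picks the microphone farthest from the centroid of the touched areas; the area-center coordinates are assumptions.

    # Illustrative sketch of the first modification: with multi-touch, select
    # the microphone farthest from the centroid of the drawn areas.
    import math

    MIC_POSITIONS = {"34a": (0, 0), "34b": (1920, 0),
                     "34c": (1920, 1080), "34d": (0, 1080)}
    AREA_CENTERS = {"AR1": (480, 270), "AR2": (1440, 270),  # assumed centers
                    "AR3": (1440, 810), "AR4": (480, 810)}

    def mic_for_multitouch(areas: set) -> str:
        """Return the microphone farthest from the centroid of the drawn areas."""
        cx = sum(AREA_CENTERS[a][0] for a in areas) / len(areas)
        cy = sum(AREA_CENTERS[a][1] for a in areas) / len(areas)
        return max(MIC_POSITIONS,
                   key=lambda m: math.dist(MIC_POSITIONS[m], (cx, cy)))

    print(mic_for_multitouch({"AR1", "AR3", "AR4"}))  # -> "34b" (nearest AR2)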
[0059]
Second Modification FIG. 8 is a front view of the input / output board unit 20 for explaining a second modification. In the situation shown in FIG. 8, the user is drawing at two positions on the touch panel display 32 simultaneously. In this case, the determination unit 50 outputs to the selection unit 52 the information of the plurality of drawing areas AR1 and AR3 of the touch panel display 32 containing the drawing positions. When it acquires the information of the drawing areas AR1 and AR3, the selection unit 52 selects the microphone 34 farthest from the drawing areas AR1 and AR3 being drawn in. However, when the user US is drawing in the two diagonally arranged drawing areas AR1 and AR3 (or the drawing areas AR2 and AR4) and there are a plurality of microphones 34 (for example, two) equally far from the drawing positions and drawing areas AR, the selection unit 52 preferentially selects the upper microphone 34b, which is unlikely to be blocked by the body of the user US, as the microphone 34 for sound collection. Similarly, when the user US is drawing simultaneously in two vertically arranged drawing areas AR (for example, the drawing areas AR1 and AR4), the selection unit 52 selects an upper microphone 34 (for example, the microphone 34b). Thereby, the conference terminal 14 can prevent the body of the drawing user from blocking the microphone 34 and hindering sound collection.
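This preference for an upper microphone can be expressed as a tie-break on the distance rule: among the microphones that are (nearly) equally far from the drawing, take the highest-mounted one. A sketch under the same assumed coordinates follows; the tolerance value is an assumption.

    # Illustrative sketch of the second modification: ties between equally
    # distant microphones are broken in favor of the upper one, which the
    # drawing user's body is less likely to block.
    import math

    MIC_POSITIONS = {"34a": (0, 0), "34b": (1920, 0),
                     "34c": (1920, 1080), "34d": (0, 1080)}

    def mic_with_upper_preference(cx: float, cy: float, tol: float = 1.0) -> str:
        """Farthest microphone from (cx, cy); ties go to the smaller y (upper)."""
        dist = {m: math.dist(p, (cx, cy)) for m, p in MIC_POSITIONS.items()}
        best = max(dist.values())
        candidates = [m for m in dist if best - dist[m] <= tol]
        return min(candidates, key=lambda m: MIC_POSITIONS[m][1])

    # Drawing centered on the left edge between AR1 and AR4: 34b and 34c
    # are equally far, and the upper microphone 34b is preferred.
    print(mic_with_upper_preference(480, 540))  # -> "34b"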
[0060]
Third Modification FIG. 9 is a front view of the input / output board unit 20 for explaining a third modification. In the situation shown in FIG. 9, the determination unit 50 determines that the user is drawing at two places on the touch panel display 32 at the same time, and outputs the information of the drawing areas AR3 and AR4 containing the drawing positions to the selection unit 52. When it acquires the information of the drawing areas AR3 and AR4, the selection unit 52 selects the microphone 34 farthest from the drawing areas AR3 and AR4 being drawn in. However, when the user is drawing in the two horizontally arranged drawing areas AR3 and AR4, there are two equally distant microphones 34 at the same height, and the selection unit 52 selects the microphone 34 for sound collection from between them.
[0061]
Fourth Modification FIG. 10 is a front view of the input / output board unit 20 for explaining a fourth modification. As shown in FIG. 10, the input / output board unit 20 may have six microphones 34a, 34b, 34c, 34d, 34e, and 34f. The microphone 34e is disposed at the center of the upper part of the input / output board unit 20, and the microphone 34f at the center of the lower part. In this case, the touch panel display 32 may be subdivided into six drawing areas AR1, AR2, AR3, AR4, AR5, and AR6 in accordance with the microphones 34. The drawing area AR5 is disposed at the upper center of the touch panel display 32, and the drawing area AR6 at the lower center. Thereby, the selection unit 52 can select the microphone 34 more accurately at the time of multi-touch.
[0062]
The function, arrangement, shape, number, and the like of each component of the above-described embodiment may be changed as appropriate. In addition, the above-described embodiment and the modifications may be combined.
[0063]
For example, the number of microphones 34 and the number of drawing areas AR may be
changed as appropriate.
[0064]
The order of steps S110, S114, and S118 of determining the drawing area AR by the
determination unit 50 described above may be changed as appropriate.
03-05-2019
19
[0065]
The embodiments described above are presented as examples and are not intended to limit the
scope of the present invention.
These novel embodiments can be implemented in various other forms, and various omissions, substitutions, and modifications can be made without departing from the gist of the invention.
The embodiments and their modifications are included in the scope and gist of the invention, and are included in the invention described in the claims and their equivalents.
[0066]
14: conference terminal, 20: input / output board unit, 26: information processing unit (computer), 30: camera, 32: touch panel display, 34: microphone, 40: control unit, AR: drawing area