Patent Translate
Powered by EPO and Google
Notice
This translation is machine-generated. It cannot be guaranteed that it is intelligible, accurate,
complete, reliable or fit for specific purposes. Critical decisions, such as commercially relevant or
financial decisions, should not be based on machine-translation output.
DESCRIPTION JP2002314987
[0001]
BACKGROUND OF THE INVENTION 1. Field of the Invention The present invention relates to a monitoring system for detecting the state of movement of a moving object, and in particular to such a system that uses acoustic data and image data.
[0002]
2. Description of the Related Art As a conventional monitoring system, there is an apparatus for identifying and tracking a moving object described in JP-A-7-287781. In this apparatus, a sensor for detecting the position of the moving object and a sensor for identifying the moving object are disposed at appropriate intervals to identify and track the moving object. Specifically, when the position-detecting sensor detects the position of the moving object, shape data, sound data, weight data, and the like of the moving object are recorded by the identifying sensor. The moving object is identified by correlating the recorded data with the moving object data in a database, and it is tracked by sequentially storing the position information of the identified moving object in the database.
[0003]
In the above prior art, when an unknown moving object is observed, the characteristics of the moving object recorded by the various sensors are registered in a database, and identification and tracking are then performed. However, since the observed features fluctuate outdoors, where observation conditions are not constant, it is difficult to perform subsequent identification and tracking accurately with a method that registers the unknown moving object in the database at the time of observation. Moreover, the prior art could not analyze the moving object in detail simultaneously with tracking.
[0004]
For unknown moving objects, it is more accurate and efficient for the operator of the monitoring system to perform the identification and analysis. It is therefore an object of the present invention to provide a monitoring system in which an operator can efficiently detect an event to be monitored, track a moving object, and confirm its features by using sounds and images recorded in the monitoring area.
[0005]
SUMMARY OF THE INVENTION In order to achieve the above object, the present invention uses acoustic data and image data cooperatively to detect a moving object. That is, when detecting a moving object, the detection results from the two kinds of data complement each other.
[0006]
More specifically, the present invention is configured as follows. (1) Acoustic data in the monitoring area are accumulated using a plurality of acoustic sensors, and image data of the main locations in the monitoring area are accumulated using image sensors. When an event to be monitored is investigated, the accumulated acoustic data are analyzed to detect the event and to track a moving object related to the event. Further, the time at which the moving object enters the imaging range of an image sensor is obtained, and an image of the moving object is acquired from the image data within a predetermined range of time around that time. (2) An event to be monitored is detected from the acoustic data recorded by an acoustic sensor, its signal waveform, or the frequency analysis result of the acoustic data. At this time, the acoustic data, the signal waveform, and the analysis result may be output. (3) The acoustic data recorded by a plurality of acoustic sensors, or the frequency analysis results of those data, are compared to track a moving object. At this time, the acoustic data and the analysis results may be output. (4) The occurrence of events to be monitored is watched for using a plurality of acoustic sensors, and the main locations in the monitoring area are monitored using the detection results of the image sensors. The position and data collection range of each sensor are displayed on a map, so that the operator can confirm the position of each sensor and its observation data. When an event to be monitored is detected, moving objects related to the event are tracked based on the acoustic data, and the occurrence point of the event and the positions of the moving objects are displayed sequentially on the map. The operator can monitor the event or the moving object by comparing position information, such as the occurrence point of the event or the movement path of the moving object, with the acoustic data or image data under observation.
[0007]
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS The present invention relates to a system for monitoring a desired event and a moving object related to it using a plurality of acoustic sensors and image sensors. Specifically, it can be realized by creating software or hardware that implements the following methods.
[0008]
(1) First Embodiment The first embodiment relates to a method of detecting a desired event from acoustic data and image data observed and accumulated using a plurality of acoustic sensors and image sensors, indexing the movement path of a moving object related to that event, and extracting an image of the moving object.
[0009]
In this embodiment, a plurality of acoustic sensors and image sensors are used to accumulate acoustic data in the monitoring area and image data of the main locations. The accumulated acoustic data are then analyzed to detect an event to be monitored and to track a moving object related to the event. Further, the time at which the moving object enters the imaging range of an image sensor is obtained, and an image of the moving object is acquired from the image data around that time.
[0010]
Hereinafter, an example of a monitoring system on a general road is shown. FIG. 1 is a block diagram of the monitoring system in an embodiment of the present invention. The sensor 101 is an acoustic sensor that observes the surrounding sound, and the sensor 102 is an image sensor that captures a predetermined place. The accumulation unit 103 continuously receives the acoustic data and image data observed by the plurality of acoustic sensors 101 and image sensors 102, and accumulates each data item together with its time information and position information. The analysis unit 104 analyzes the accumulated acoustic data and image data.
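As an illustration only (the patent does not specify a storage format), the accumulation unit 103 might store each observation as a record carrying the sensor identifier, time, and position. The following is a minimal Python sketch; all names, types, and the query interface are assumptions.

    from dataclasses import dataclass, field
    from typing import List, Tuple

    @dataclass
    class SensorRecord:
        sensor_id: str                  # e.g. "acoustic-211" or "image-221" (labels assumed)
        timestamp: float                # observation time in seconds
        position: Tuple[float, float]   # sensor position in the monitoring area
        data: bytes                     # raw audio samples or an encoded image frame

    @dataclass
    class Archive:
        records: List[SensorRecord] = field(default_factory=list)

        def store(self, record: SensorRecord) -> None:
            self.records.append(record)

        def query(self, sensor_id: str, t_start: float, t_end: float) -> List[SensorRecord]:
            # Return all records of one sensor that fall inside a time window.
            return [r for r in self.records
                    if r.sensor_id == sensor_id and t_start <= r.timestamp <= t_end]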
[0011]
FIG. 2 shows an example arrangement of the acoustic sensors 211-213 and the image sensors 221-224 in the monitoring area 201. The acoustic sensors 211-213 are arranged so as to be able to record both the sound of an event 202 to be monitored (an explosion, for example) occurring in the monitoring area 201 and the sound of a moving object 203 (a car, for example) related to the event 202. Specifically, a monitorable area within which one acoustic sensor can record sound is determined, and the acoustic sensors 211-213 are arranged so that their monitorable areas together cover the whole monitoring area 201 while overlapping one another as little as possible. The acoustic sensors 211-213 may be of any type capable of acoustic data input; in the present embodiment, monaural or stereo microphones are used.
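This placement criterion can be checked numerically. The sketch below, under assumed coordinates, audible radius, and grid resolution, estimates what fraction of the monitoring area is covered by at least one sensor and what fraction by two or more (the overlap to be minimized).

    import numpy as np

    def coverage_stats(sensors, radius, area=(100.0, 100.0), step=1.0):
        # sensors: list of (x, y) positions; radius: audible range of one sensor.
        xs = np.arange(0.0, area[0], step)
        ys = np.arange(0.0, area[1], step)
        gx, gy = np.meshgrid(xs, ys)
        counts = np.zeros(gx.shape)
        for sx, sy in sensors:
            counts += (np.hypot(gx - sx, gy - sy) <= radius)
        covered = float(np.mean(counts >= 1))   # heard by at least one sensor
        overlap = float(np.mean(counts >= 2))   # heard by two or more sensors
        return covered, overlap

    # Three sensors along a road; positions and radius are made-up examples.
    print(coverage_stats([(20.0, 50.0), (50.0, 50.0), (80.0, 50.0)], radius=35.0))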
[0012]
The image sensors 221-224 are arranged so as to be able to record images of a predetermined range 204 (such as an intersection). Imaging devices such as ITV cameras are used as the image sensors 221-224.
[0013]
FIG. 3 shows the processing procedure in the analysis unit 104. The analysis unit 104 detects the event 202 and tracks the moving object 203 using the accumulated acoustic data, and extracts an image capturing the moving object 203 from the accumulated image data in view of the tracking result. The processing procedure is described below, with the numbers in FIG. 3 as the process step numbers. Step 301: Event detection. The event 202 and the moving object 203 are detected using the accumulated acoustic data.
[0014]
FIG. 4 is an explanatory view of the event detection process. As shown in FIG. 4, the operator 402 listens to the acoustic data observed by the acoustic sensors 211-213 in the area 401 under examination, determines the types of the sounds, and thereby detects the event 202 and the moving object 203. In the example of FIG. 4, the event 202 can be detected by listening to the acoustic data observed by the acoustic sensor 211 or the acoustic sensor 212.
[0015]
Also, as shown in FIG. 5, the event 202 can be detected by visually inspecting the signal waveform graphs 511-513 of the acoustic data observed by the acoustic sensors 211-213 in the region 401 under examination, together with the time-frequency analysis images 521-523 of the acoustic data. The signal waveform graphs 511-513 display the observed acoustic data in time series, with time on the horizontal axis and amplitude on the vertical axis. The time-frequency analysis images 521-523 display the intensity of the frequency spectrum of the acoustic data two-dimensionally as luminance values, with frequency on the horizontal axis and time on the vertical axis. Various frequency analysis techniques, such as the Fourier transform and the wavelet transform described in Document 1 below, can be used to compute the frequency spectrum. Svend Gade, Klaus Gram-Hansen, "The Analysis of Nonstationary Signals", Sound and Vibration, (1), pp. 40-46, 1997. (Document 1) The sounds generated by the event 202 and the moving object 203 can be confirmed as the amplitude changes 531 and 532 on the signal waveform graphs 511 and 512, and as the points and lines 541 and 542 with high luminance values on the time-frequency analysis images 521 and 522.
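As one concrete realization of the time-frequency analysis named above, a short-time Fourier transform could produce images such as 521-523. The sketch below uses scipy; the sampling rate, window length, and the synthetic stand-in signal are assumptions.

    import numpy as np
    from scipy import signal

    fs = 16000                                  # assumed sampling rate, Hz
    t = np.arange(0, 5.0, 1.0 / fs)
    audio = np.sin(2 * np.pi * 440 * t)         # stand-in for recorded acoustic data

    # Frequency-spectrum intensity over time, displayable as luminance values.
    f, times, Sxx = signal.spectrogram(audio, fs=fs, nperseg=1024, noverlap=512)
    intensity_db = 10 * np.log10(Sxx + 1e-12)   # dB scale for display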
[0016]
When the sounds of a plurality of sources are mixed in the acoustic data, the acoustic data of each source is separated, and the event 202 and the moving object 203 are then detected. As a separation method for acoustic data, various signal separation techniques can be used, such as the independent component analysis described in Document 2 below. Ikeda, S., "Application of Independent Component Analysis to Signal Processing," Measurement and Control, Vol. 38, No. 7, pp. 461-467, 1997. (Document 2)
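A minimal sketch of such a separation using the FastICA implementation in scikit-learn (an independent component analysis algorithm); the two-source, two-microphone mixing below is an illustrative assumption, not the patent's setup.

    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    n = 16000
    time = np.linspace(0, 1, n)
    s1 = np.sin(2 * np.pi * 5 * time)            # stand-in source 1
    s2 = np.sign(np.sin(2 * np.pi * 7 * time))   # stand-in source 2
    S = np.c_[s1, s2] + 0.01 * rng.standard_normal((n, 2))

    A = np.array([[1.0, 0.6], [0.4, 1.0]])   # unknown mixing at two microphones
    X = S @ A.T                               # observed mixed acoustic data

    ica = FastICA(n_components=2, random_state=0)
    S_est = ica.fit_transform(X)              # estimates of the separated sources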
Step 302: Moving object tracking. The moving object 203 detected in step 301 is tracked using the accumulated acoustic data.
[0017]
FIG. 6 is an explanatory view of the acoustic tracking process. As shown in FIG. 6, the operator 601 selects the two acoustic sensors 212 and 213 and listens to the acoustic data observed by each sensor simultaneously, one in each ear, thereby confirming the moving direction of the moving object 203. In the example of FIG. 6, the operator 601 perceives the moving object 203 as moving from left to right in front of him. Similarly, by sequentially selecting pairs of acoustic sensors and listening to their acoustic data, the moving object 203 can be tracked and its movement path obtained.
[0018]
The moving object 203 can also be tracked by comparing the signal waveform graphs 711-713 and the time-frequency analysis images 721-723 of the acoustic data shown in FIG. 7 across two or more acoustic sensors. In the example of FIG. 7, the sound generated by the moving object appears as the characteristic waveforms 732 and 733 and as the lines 742 and 743 with high luminance values. Comparing the acoustic sensor 212 with the acoustic sensor 213, a waveform 733 similar to the characteristic waveform 732 on the signal waveform graph 712 appears on the signal waveform graph 713 with a time delay. Further, the time-frequency analysis image 722 shows that the observed acoustic data contain the frequency component f1, while on the time-frequency analysis image 723 the same frequency component f1 appears delayed in time. That is, it can be seen that the moving object 203 has moved from the vicinity of the acoustic sensor b toward the vicinity of the acoustic sensor c. Step 303: Moving object image extraction. A captured image of the moving object 203 is extracted from the accumulated image data.
[0019]
FIG. 8 is an explanatory diagram of the image extraction process. First, based on the movement path information obtained in step 302, the image sensor whose monitoring range the moving object 203 has passed through is identified, and the passing time t0 is obtained. In the example of FIG. 8, the moving object 203 passes through the monitoring range 204 of the image sensors 221-224. Next, the captured image of the moving object is visually extracted from the image data 811-814 observed by the identified image sensors 221-224 before and after time t0.
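Under the illustrative Archive sketch given earlier, this step reduces to a window query around the passing time t0; the window width, sensor label, and helper name below are assumptions.

    def frames_around(archive, sensor_id, t0, window=10.0):
        # Image records within [t0 - window, t0 + window] seconds.
        return archive.query(sensor_id, t0 - window, t0 + window)

    # e.g. candidates = frames_around(archive, "image-221", t0)
    # The operator then inspects these frames visually, as in FIG. 8.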
[0020]
Further, by analyzing the captured image of the moving object 203 in detail, characteristic information such as the driver, the car model, and the license plate can be detected. FIG. 9 shows an example of such analysis when the moving object 203 is a car. Image 901 shows the driver of the car, image 902 the shape of the car, and image 903 the license plate number. By detecting features such as those shown in FIG. 9, the event 202 can be resolved early. (2) Second Embodiment The second embodiment relates to a method of detecting an event to be monitored from acoustic data and image data observed using a plurality of acoustic sensors and image sensors, tracking a moving object related to the event, and confirming it by its image. The difference from the first embodiment is that the events and moving objects are monitored in real time.
[0021]
In this embodiment, a plurality of acoustic sensors are used to constantly monitor for the occurrence of events to be monitored and for moving objects. In addition, image sensors are used to constantly capture the main locations in the monitoring area. The position and data collection range of each sensor are displayed on a map, so the operator can confirm the position of each sensor and its observation data. When the occurrence of an event to be monitored is detected, a moving object related to the event is tracked, and the occurrence point of the event and the position of the moving object are displayed sequentially on the map. The operator monitors the event or the moving object by comparing position information, such as the occurrence point of the event or the movement path of the moving object, with the acoustic data or image data under observation. An example of a monitoring system on a general road is shown below.
[0022]
FIG. 10 shows an example of monitoring in the event and moving object monitoring system. A map 1001 represents the monitoring area; symbols 1011-1013 indicate the positions of the acoustic sensors, and symbols 1021-1024 indicate the positions of the image sensors.
[0023]
The specific monitoring method is shown below. When an event to be monitored is detected by the monitoring system, its occurrence point is displayed on the map 1001 with the symbol 1002. The status of tracking a moving object using the acoustic sensors is displayed on the map 1001 with the symbol 1003. The operator selects a sensor with the mouse 1004 or the like while observing the movement state 1003 of the moving object, and confirms the data observed by that sensor. In response to the operator's instruction, acoustic data 1005, image data 1006, analysis results 1007 for the moving object, and the like, whether currently observed or observed in the past, are displayed.
[0024]
Hereinafter, a method of detecting an event, a method of tracking a moving object, and a method
of extracting an image of a moving object will be specifically described.
[0025]
As a method of detecting an event, threshold processing with a threshold value Ts is performed on the observed acoustic data, and a desired event is considered detected when the acoustic data reach or exceed the threshold Ts. In the example of FIG. 5, the signals 531 and 532 are detected by this thresholding.
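A minimal sketch of this threshold test; the sample format and the rising-edge handling are assumptions.

    import numpy as np

    def detect_events(audio, fs, Ts):
        # Times (seconds) where |amplitude| first rises to the threshold Ts.
        hits = np.abs(audio) >= Ts
        rising = hits & ~np.r_[False, hits[:-1]]   # report each excursion once
        return np.flatnonzero(rising) / fs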
[0026]
Further, threshold processing may similarly be performed on the frequency spectrum of the observed acoustic data, and a desired event may be detected by detecting peaks of the frequency spectrum. In the example of FIG. 5, the points and lines 541 and 542 with high luminance values are detected by the threshold processing. Various other signal detection methods can also be used for automatic event detection.
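The spectral variant might look like the following, thresholding peaks of the frequency spectrum of one analysis frame; the FFT usage and the threshold name Ts_spec are assumptions.

    import numpy as np
    from scipy.signal import find_peaks

    def spectral_event(frame, fs, Ts_spec):
        # Frequencies whose spectrum peaks reach the threshold; empty if no event.
        spectrum = np.abs(np.fft.rfft(frame))
        freqs = np.fft.rfftfreq(len(frame), d=1.0 / fs)
        peaks, _ = find_peaks(spectrum, height=Ts_spec)
        return freqs[peaks]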
[0027]
As a method of tracking a moving object, threshold processing is performed on the acoustic data and the frequency spectrum as in the detection above, and when acoustic data or a frequency spectrum above a predetermined level is detected, the moving object is assumed to be passing near that acoustic sensor. Because the position of each acoustic sensor is known, the moving object can be tracked. In the example of FIG. 7, the signal 732 at time t1 and the signal 733 at time t2 are detected by the threshold processing, so the position of the moving object is known at each time. Similarly, for the frequency spectrum, the position of the moving object can be determined by detecting the lines 742 and 743 with high luminance values.
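Combining the pieces, a sketch of this tracking idea: run the threshold test on each sensor's data and, since every sensor's position is known, read off an approximate position for the moving object at each detection time. The data layout follows the earlier sketches and is assumed.

    def track_by_passage(sensor_streams, sensor_positions, fs, Ts):
        # sensor_streams: {sensor_id: audio array}; returns [(time_s, (x, y)), ...]
        path = []
        for sid, audio in sensor_streams.items():
            for t in detect_events(audio, fs, Ts):   # detect_events: sketch above
                path.append((float(t), sensor_positions[sid]))
        return sorted(path)                          # detections ordered in time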
[0028]
There is also a method in which the acoustic data observed by adjacent acoustic sensors are correlated, and the moving direction of the moving object is taken to be toward the acoustic sensor whose observed acoustic data is highly correlated with that of its neighbor and shows the same waveform delayed in time. In the example of FIG. 7, correlating the signals 711 and 712 and the signals 712 and 713 shows that the correlation between the signals 712 and 713 is high and that the signal 713 is delayed in time. That is, the moving object is moving from the vicinity of the acoustic sensor b toward the acoustic sensor c.
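The correlation test can be sketched with numpy as follows; the sensor labels b and c follow FIG. 7, and a positive delay means sensor c hears the similar waveform later, i.e. the moving object travels from b toward c.

    import numpy as np

    def arrival_delay(x_b, x_c, fs):
        # Lag (seconds) by which sensor c's waveform trails sensor b's.
        corr = np.correlate(x_c - x_c.mean(), x_b - x_b.mean(), mode="full")
        lag = int(np.argmax(corr)) - (len(x_b) - 1)
        return lag / fs

    # arrival_delay(...) > 0 suggests movement from sensor b toward sensor c.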
As an automatic tracking method for a moving object, various position detection methods can be used, such as the method of Document 3 below. Kobayashi, K. et al., "Position Estimation of Multiple Freely Placed Speakers Using Multiple Microphones", Shingaku Theory A, Vol. 82, No. 2, pp. 193-200, 1999. (Document 3) As a method of extracting an image of a moving object, various detection techniques can be used, such as a method using the distribution pattern inside the object, a method using the outline shape of the object, and the method described in Document 4 below. Mitani, S. et al., "Automobile Detection by Gabor Transform", Shingaku Theory D-II, Vol. (Document 4)

(3) Specific Examples of Events and Moving Objects to Be Monitored The above embodiments have shown specific examples of the monitoring system of the present invention. Specific examples of events and moving objects to be monitored with this monitoring system are given below.

(3-1) Monitoring of traffic accidents The monitored location is a road; microphones are installed on utility poles and street lights along the road to record the sounds on the road. In addition, cameras are installed at intersections to record images of their vicinity.
[0029]
Since the event to be monitored is a traffic accident, sounds related to the event include the crash of a car, very loud braking, and human screams. The moving objects related to the event are mainly cars.
[0030]
A monitoring method in which the monitoring system of the present invention is applied to traffic accident monitoring is shown below. The operator checks the observed acoustic data, paying attention to sounds associated with a traffic accident. The acoustic data may be checked constantly; alternatively, when the monitoring system judges that a traffic accident has occurred, the operator confirms whether the judgment is correct. If the operator determines that a traffic accident has been detected, the sound of a car fleeing the vicinity of the site is checked and the car is tracked. Then, the captured images of the intersections through which the tracked car passes are checked, and the car or driver that caused the traffic accident is identified.

(3-2) Monitoring of illegal dumping of waste The monitored locations are open areas, roads, and riverbeds where the dumping of waste is a problem; microphones are installed at those places and along the surrounding roads to record sound. In addition, cameras are installed at the entrances of the open areas and at the intersections of the surrounding roads to record images.
[0031]
Since the event to be monitored is illegal dumping of waste, sounds related to the event include the sound of trash being dumped, the sound of a car, and human voices. The moving objects related to the event are cars and people.
[0032]
A monitoring method in which the monitoring system of the present invention is applied to monitoring illegal dumping of waste is shown below. The operator checks for sounds associated with illegal dumping, as in the traffic accident case above. If the operator determines that illegal dumping has been detected, the operator checks for the sound of a car fleeing the vicinity of the dumping site and for people's footsteps and voices, and tracks the cars and people. Then, the captured images at the entrances of the open areas or the intersections through which the tracked cars or people pass are checked to identify the car or person that illegally dumped the waste.

(3-3) Monitoring of incidents on streets and in residential areas The monitored locations are deserted areas, roads with little traffic, and residential areas; sounds and images are recorded there.
[0033]
Since the event to be monitored is an incident, sounds related to the event include the sound of breaking glass, people's voices, and screams. The moving objects related to the event are cars and people.
[0034]
A monitoring method in which the monitoring system of the present invention is applied to incident surveillance is shown below. The operator checks for sounds associated with an incident, as in the traffic accident case above. If the operator determines that an incident has been detected, the sound of a car fleeing the vicinity of the site and the footsteps and voices of people are checked, and the cars and people are tracked. Then, the captured images at the intersections through which the tracked cars or people pass are checked to identify the car or person involved in the incident.
[0035]
In the cases above, the time and place of an incident or accident were determined first, and then moving objects such as the cars and people involved were tracked and identified. Conversely, by tracing the behavior of suspicious cars and people retroactively, it is possible to identify the time, place, and cause of an incident or accident that could not otherwise be confirmed. Also, by tracking and monitoring moving objects such as suspicious cars and people, it is possible to prevent an incident or accident from occurring.
[0036]
According to the present invention, the operator can efficiently detect an event, track a moving object, and confirm its characteristics using the sounds and images of the event or moving object being monitored.