Patent Translate
Powered by EPO and Google
Notice
This translation is machine-generated. It cannot be guaranteed that it is intelligible, accurate,
complete, reliable or fit for specific purposes. Critical decisions, such as commercially relevant or
financial decisions, should not be based on machine-translation output.
DESCRIPTION JP2014192553
Abstract: The present technology makes it possible to provide parameters such as equalizer settings easily and widely. An information processing apparatus includes: a parameter input unit for inputting parameter information for setting an operation state of a target apparatus; an image conversion unit for generating converted image data obtained by imaging the parameter information; and a setting file image generation unit configured to generate setting file image data in which the converted image data is arranged in image data having a larger image size than the converted image data. The setting file image data is presented as an image on an SNS or the like. By allowing viewers to receive the image data, parameter setting can be performed on various devices. [Selected figure] Figure 5
INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND
PROGRAM
[0001]
The present technology relates to an information processing apparatus, an information
processing method, and a computer program for imaging various parameters of an electronic
device or the like and handling a file obtained by imaging.
[0002]
JP 2000-235772 A
[0003]
In recent years, many electronic devices for audio, video, networking and other entertainment purposes have been sold. These devices set and use various parameters for specific functions. Usually, since the various parameters are set by default, there is no problem in using the device even if the user makes no special settings. On the other hand, if these parameters can be set freely, the user can further expand the enjoyment of using the device and can use it according to his or her preference.
[0004]
For example, in an audio device or the like, the frequency characteristic of the equalizer can be set as such a parameter. There are also devices that can automatically set the equalizer in accordance with the genre (category) of music (see Patent Document 1). In this case, the frequency characteristics are automatically adjusted for each genre such as jazz, classical music, or pop, and the user can listen to and enjoy music with the adjusted characteristics.
[0005]
In addition, each individual can set the equalizer parameters according to his or her own judgment, aiming for a unique sound. By operating and setting the equalizer regardless of genre, it is possible to enjoy music tailored to the user's preference.
[0006]
It is also conceivable not only to enjoy one's own equalizer settings but to share them with others so that each individual can enjoy them. If examples of equalizer settings recommended by famous musicians, artists, creators and the like can be shared, the user's enjoyment expands. In addition, if the contents of equalizer settings can be shared among friends with the same musical taste, this becomes a place of information exchange, and active interaction can be expected. The present technology aims to provide parameters, such as the frequency characteristics of an equalizer set by a user, easily and widely, and to make such parameters easy to use.
[0007]
First, an information processing apparatus according to an embodiment of the present technology includes: a parameter input unit that inputs parameter information for setting an operation state of a target apparatus; an image conversion unit that generates converted image data obtained by imaging the parameter information; and a setting file image generation unit configured to generate setting file image data in which the converted image data is arranged in image data having a larger image size than the converted image data. By converting the parameter information into setting file image data, the parameter information can be widely distributed.
[0008]
Second, in the information processing apparatus according to the present technology described above, it is desirable that the setting file image data further include identification image data obtained by imaging identification information identifying the data as an image that includes the converted image data. Thus, an application or the like that uses the setting file image data can recognize that the image data is setting file image data.
[0009]
Third, in the information processing apparatus according to the present technology described
above, it is preferable that the image conversion unit convert the parameter information into
color information or transparency information to generate the converted image data. Fourth, the
image conversion unit preferably converts the numerical value itself included in the parameter
information into color information or transparency information to generate the converted image
data. Fifth, it is preferable that the image conversion unit convert the numerical value included in
the parameter information into color information or transparency information according to a
conversion pattern to generate the converted image data. Sixth, the image conversion unit
preferably uses a conversion pattern that has been selected or input. These features make it possible to enhance the visibility of the image represented by the setting file image data and to convey the parameter contents to a user who views the image.
[0010]
Seventh, in the information processing apparatus according to the present technology described
above, the setting file image data includes image data corresponding to the converted image data.
Eighth, the setting file image data includes an image presenting that it is an image including the
converted image data. Ninth, the setting file image data includes an image presenting the
parameter information creator. Tenth, the setting file image data includes an image presenting
the target device. Thus, the setting file image data makes the image easier to view and helps the user grasp its content when looking at it.
[0011]
Eleventh, in the information processing apparatus according to the present technology described above, it is preferable that the parameter information be parameters of equalizer characteristics. This makes it possible to share parameters of audio devices, headphone devices and the like.
[0012]
An information processing method according to the present technology inputs parameter information for setting an operation state of a target device, generates converted image data obtained by imaging the parameter information, and generates setting file image data in which the converted image data is arranged in image data having an image size larger than the converted image data. A program of the present technology is a program that causes an information processing apparatus to execute such an information processing method. The above-described configuration is suitable for realizing an information processing apparatus that converts parameter information into an image to obtain setting file image data.
[0013]
Further, an information processing apparatus according to the present technology, that is, an information processing apparatus that uses setting file image data, includes: a setting file image acquisition unit for receiving setting file image data in which converted image data, obtained by imaging parameter information for setting an operation state of a target apparatus, is arranged in image data having an image size larger than the converted image data; a converted image extraction unit for extracting the converted image data from the setting file image data; and a parameter decoding unit for generating parameter information from the converted image data. An information processing method according to an embodiment of the present technology receives setting file image data in which converted image data obtained by imaging parameter information for setting an operation state of a target device is arranged in image data having an image size larger than the converted image data, extracts the converted image data from the setting file image data, and generates parameter information from the converted image data. A program of the present technology is a program that causes an information processing apparatus to execute such an information processing method. According to such an embodiment of the present technology, it is possible to receive imaged setting file image data, convert it into parameter information of a target device, and use it.
[0014]
According to the present technology, parameter information is handled as setting file image data, which can be easily distributed on a network like photographs and illustrations, so that parameter information can easily be shared among many users.
[0015]
It is a conceptual diagram of setting an equalizer of a target device.
It is a conceptual diagram showing an example of sharing setting information between users.
It is a conceptual diagram showing an example of sharing setting information via a dedicated server or the like.
It is a conceptual diagram showing system operation on an SNS for sharing setting information.
It is a block diagram of an information processing apparatus of an embodiment.
It is a diagram showing a specific setting waveform and numerical values for the equalizer setting of the embodiment.
It is an explanatory diagram of setting file image data generation example I according to the embodiment.
It is an explanatory diagram of setting file image data generation example II of the embodiment.
It is an explanatory diagram of setting file image data generation example II of the embodiment.
It is a flowchart of the image generation operation of the embodiment.
It is a flowchart of the image generation operation of the embodiment.
It is a flowchart of the reading operation of the embodiment.
[0016]
Hereinafter, the contents of the present technology will be described in the following order.
<1. Parameter Sharing Method>
<2. Configuration of Information Processing Apparatus>
<3. Setting File Image Data Generation Example I>
<4. Setting File Image Data Generation Example II>
<5. Initial Communication with Target Device>
<6. Setting File Image Data Generation Process>
<7. Process When Using Setting File Image Data>
<8. Program and Modification>
[0017]
<1. Method of Sharing Parameters> Hereinafter, a method by which a large number of users can share a parameter that sets a specific function of various electronic devices will be described with reference to FIG. 1 to FIG. 4, taking the equalizer setting of an audio device (active headphones) as an example. In the case of the equalizer setting, the gain (dB) for each predetermined frequency corresponds to the parameter. First, consider the case where an individual changes the setting of the equalizer. FIG. 1 schematically shows an operation of changing the setting contents of the equalizer of the active headphones 3 (hereinafter referred to as the headphones). As shown in FIG. 1, the user A edits the setting contents of the equalizer of the headphones 3 with the equalizer characteristic editing apparatus 1 (a PC terminal or a smartphone terminal). This editing is performed by equalizer characteristic editing software (not shown) operating on the equalizer characteristic editing apparatus 1. The edited setting contents of the equalizer are stored in the equalizer characteristic editing apparatus 1 as the setting file 2. The edited characteristic may be a user-created characteristic obtained by modifying the manufacturer's recommended characteristic, or may be newly created by the user from scratch.
[0018]
The saved setting file 2 can be sent to the headphones 3 via, for example, Bluetooth (a wireless standard and registered trademark), USB (Universal Serial Bus), or other transmission standards. On the headphones 3 side, the received equalizer setting can be applied as a new setting. The headphones 3 then reproduce sound with the applied equalizer setting. By the above operation, the user A can listen to and enjoy music through the headphones 3 in which the user-created characteristics are set. In this case, the setting content of the equalizer is owned by an individual.
[0019]
On the other hand, if users having the same headphones can share the setting file 2, music and the like can be listened to using characteristics created by other users, and those users' settings can be enjoyed. That is, the same experience can be shared between users (individual differences in hearing are not considered here). In particular, if the person who made the setting is a prominent artist or the like, the characteristics they set take on special meaning, and it is a great pleasure to listen to music with the headphones 3 configured with them. In addition, since a setting file 2 created by another person can be used when sharing is possible, the time and effort of creating the setting file 2 can be saved.
[0020]
FIG. 2 shows an example of a sharing method in which the setting file 2 is shared among users. As shown in FIG. 2, one conceivable sharing method is to attach the setting file 2 to an e-mail between individuals. For example, the setting file 2 is sent from the user A to the user B and the user C by e-mail. Alternatively, the user B may send the setting file 2 received from the user A to the user C by e-mail. Thus, the setting file 2 can be shared by the three users A, B, and C. In this case, however, the scope of sharing is usually limited to one's friends and acquaintances.
[0021]
The scope of sharing can be extended by uploading the file to an ftp server or http server owned or contracted by the user and publishing the URL (uniform resource locator) on an SNS (social networking service). FIG. 3 shows the conceptual diagram. The general flow of the operation is as follows. As shown in FIG. 3, (i) the setting file 2 is uploaded to a server 4 (ftp://xxx.ne.jp/yyy/zzz/ or a dedicated server) to which a certain user subscribes. (ii) The user who uploaded the setting file 2 writes the upload URL on the SNS bulletin board. (iii) Another user sees the URL. (iv) That user downloads the setting file 2 from the viewed URL. (v) The downloaded setting file 2 is set in the headphones 3.
[0022]
By the above operation, another user can listen to music with the equalizer setting represented by the setting file 2, and can enjoy the setting characteristics of another user. That is, users can have the same experience. In this case, anyone can use the setting file 2 if there is no restriction on access to the URL. The range of sharing can be very wide compared to the case of FIG. 2; for example, the setting file 2 can be shared quickly and flexibly with an unspecified number of people. However, in this case a dedicated server must be set up and operated, which takes the person operating it a considerable amount of time and effort. In addition, there is also a cost burden.
[0023]
In the present embodiment, a sharing method is proposed that is easier and broader for the user, and easier to use, than the sharing methods described above. By imaging parameter information as setting file image data 6 and posting it on the bulletin board 5 on the SNS, a parameter setting file can be shared easily and simply. The system operation will be briefly described with reference to FIG. 4. In FIG. 4, the parameter setting created by the user A is turned into setting file image data 6x, which is posted on the bulletin board 5 on the SNS. The figure then outlines an operation in which the user B downloads and uses the setting file image data 6x posted on the SNS.
[0024]
As shown in FIG. 4, the information processing apparatus 1, such as a PC (personal computer) or smartphone, includes an SNS application 11 and a setting editing application 12. The SNS application 11 is software that enables access to the SNS; with it, predetermined information can be posted on the bulletin board 5 on the SNS. The setting editing application 12 is software for creating setting file image data 6. Here, two pieces of setting file image data 6, setting file image data 6x and setting file image data 6y, are created. The user A can create any number of pieces of setting file image data 6 using the setting editing application 12. Then, for example, the created setting file image data 6x can be posted on the bulletin board 5 on the SNS by the SNS application 11.
[0025]
The user B looks at the bulletin board 5 and downloads the setting file image data 6x to the information terminal (information processing apparatus 1) that the user B uses. The user B can then apply the frequency characteristic setting content of the equalizer (hereinafter also referred to as the "EQ parameter") included in the setting file image data 6x to the equalizer of the headphones 3 and listen to music in that state. Although the setting file image data 6x is an image, the user B or the like can also recognize the creator and the contents of the EQ parameters from the image itself. The method of generating the image so that the creator and contents can be recognized will be described later. Then, for example, when the user B sees the setting file image data 6x on the bulletin board 5, the user B can recognize that it was created by the user A and can further recognize what the equalizer settings are. Therefore, if the user A is a prominent musician, or if the equalizer setting is an interesting one, other users may become interested in obtaining the setting file image data 6x. Also, for the user B and others, listening to music with the setting content represented by the setting file image data 6x further extends their enjoyment.
[0026]
The user A can use the created setting file image data 6x and 6y not only on the bulletin board 5 on the SNS but also for the headphones 3 that the user A uses. In this case, the EQ parameter created by the user does not necessarily have to be imaged as setting file image data 6; that is, the EQ parameters themselves could be transferred to the user's own headphones 3, but here an example using the setting file image data 6 will be described. As shown in FIG. 4, the headphones 3 include, for example, a setting RAM 21, an AD converter 22, a DSP (Digital Signal Processor) 23, a DA converter 24, an output amplifier 25, and a speaker unit 26. In the figure, the two-channel stereo configuration is shown in a simplified manner. An audio signal input from an external DMP (Digital Media Player) 27 is converted into a digital signal by the AD converter 22, and the digital signal is processed by the DSP 23. In this case, since the contents of the EQ parameter decoded from the setting file image data 6x are written in the setting RAM 21, the DSP 23 performs equalization processing using the EQ parameter. The processed signal is then converted into an analog signal by the DA converter 24 and led from the output amplifier 25 to the speaker unit 26. As a result, the reproduced sound equalized with the EQ parameter extracted from the setting file image data 6 can be heard from the speaker unit 26. Although the description here is for the headphones 3 of the user A, the same operation applies to the headphones 3 of the user B.
[0027]
The above is the outline of the system operation. FIG. 4 shows a mode in which the user A creates setting file image data 6x and 6y and posts the setting file image data 6x on the bulletin board 5 on the SNS, and the user B downloads the setting file image data 6x, sets it in the headphones 3, and listens to music. Conversely, it is of course possible for the user B to create setting file image data 6, upload it to the bulletin board 5 on the SNS, and for the user A to download and use it. Further, the exchange is not limited to two persons: anyone can upload created setting file image data 6, and anyone can download the uploaded setting file image data 6 and set it in the headphones 3. The range of sharing is as wide as in the case of FIG. 3.
[0028]
<2. Configuration of Information Processing Device> The configuration of the information processing device 1 according to the embodiment will be described with reference to FIG. 5. The information processing apparatus 1 is actually configured by, for example, a microcomputer including a central processing unit (CPU), a read only memory (ROM), a random access memory (RAM), various interfaces and the like, together with peripheral circuits. As shown in FIG. 4, it can be realized as a personal computer, a tablet device, a smartphone or the like. FIG. 5 shows a functional configuration for realizing the operation of the embodiment. FIG. 5A shows a functional configuration for generating and uploading the setting file image data 6, and FIG. 5B shows a functional configuration for downloading and using the setting file image data 6. For convenience, two figures are shown here, but in practice a single information processing apparatus 1 can be realized by providing both configurations (processing functions in the signal processing unit 50) in software or hardware. Of course, it is also possible to realize an apparatus dedicated to generating and uploading setting file image data 6 and an apparatus dedicated to downloading and using it.
[0029]
As shown in FIGS. 5A and 5B, the information processing apparatus 1 includes a control unit 35,
a storage unit 38, an operation unit 34, a display unit 36, a communication unit 37, and a signal
processing unit 50. The control unit 35 is configured by a microcomputer and controls each unit.
The storage unit 38 is used to store various data as a RAM, a non-volatile memory, or a drive of a
portable storage medium. The operation unit 34 represents various operation devices such as a
keyboard, a mouse, and a touch panel. The display unit 36 is a liquid crystal display device, an
organic EL (Electro Luminescence) display device, or the like, and presents various information to
the user. The communication unit 37 comprehensively represents various communication devices, such as a communication unit for a public network such as the Internet, a Bluetooth communication unit, a USB communication unit, a cable communication unit, an infrared communication unit, a wired/wireless communication unit of a public line, and a LAN interface unit.
The information processing apparatus 1 can perform SNS access or communicate with peripheral
devices such as headphones via the communication unit 37. The signal processing unit 50 is
constituted by, for example, a DSP, a microcomputer, etc., and performs various arithmetic
processing.
[0030]
First, as shown in FIG. 5A, a parameter input unit 31, an image conversion unit 32, and a setting file image generation unit 33 are provided in the signal processing unit 50 as functions for generating and uploading the setting file image data 6. The parameter input unit 31 takes in a parameter that defines a specific function of the target device; here, it is assumed that the EQ parameter of the headphones 3 is taken in. The EQ parameter may be created according to user operation on the operation unit 34 and input to the parameter input unit 31, or a parameter preset in the external headphones 3 may be transmitted via the communication unit 37 and input.
[0031]
The EQ parameters input to the parameter input unit 31 are displayed on the display unit 36.
The operator can edit the EQ parameters by operating the operation unit 34 while looking at the
display unit 36. For example, parameter editing (or creation) is possible by touch operation on
the display panel of the display unit 36. The image conversion unit 32 generates converted
image data obtained by imaging the input or edited EQ parameters in a predetermined format.
The setting file image generation unit 33 generates setting file image data 6 in which the
converted image data is arranged in image data of an image size larger than the converted image
data. Specific examples of these processes will be described later.
[0032]
The generated setting file image data 6 is stored in the storage unit 38. The setting file image
data 6 stored in the storage unit 38 is transmitted, ie, uploaded, onto the SNS via the
communication unit 37. Thereby, the setting file image data 6 is posted on the bulletin board 5.
[0033]
As shown in FIG. 5B, the signal processing unit 50 is provided with a setting file image acquisition unit 41, a converted image extraction unit 42, and a parameter decoding unit 43 as functions for downloading and using the setting file image data 6. For example, when a user who views an image of setting file image data 6 on the bulletin board 5 of the SNS performs an operation to download it, the setting file image data 6 is downloaded and received by network communication via the communication unit 37 and stored in the storage unit 38. The setting file image acquisition unit 41 of the signal processing unit 50 reads the received setting file image data 6 from the storage unit 38.
[0034]
The converted image extraction unit 42 extracts converted image data from the setting file image
data 6. As described above, the setting file image data 6 is data in which converted image data
obtained by imaging the EQ parameter information is disposed in image data having a larger
image size than the converted image data. Therefore, the converted image extraction unit 42
extracts a portion of converted image data obtained by imaging the EQ parameter from the
setting file image data 6. The parameter decoding unit 43 generates an EQ parameter from the
extracted converted image data. In this case, decoding processing corresponding to the encoding
of imaging in the image conversion unit 32 of FIG. 5A is performed to obtain an original EQ
parameter. The EQ parameters thus obtained are stored in the storage unit 38. The EQ
parameters stored in the storage unit 38 are transmitted from the communication unit 37 to the
headphones 3 in accordance with the user's operation or the like, and are used as equalizer
settings of the headphones 3.
[0035]
<3. Setting File Image Data Generation Example I> An example of setting file image data 6 generated by imaging according to the embodiment will be described below with reference to FIGS. 6 and 7. Any type of parameter can be imaged; here, imaging of parameters for the equalizer setting of the headphones 3 will be described as an example. FIG. 6 shows a specific setting waveform (frequency characteristic) for the equalizer setting and specific numerical values for it. FIG. 6A shows the setting contents for the equalizer setting in the form of a waveform. The horizontal axis represents frequency (Hz) and the vertical axis represents gain (dB). This waveform can be created and edited by the user from scratch in the information processing apparatus 1, or the current equalizer setting content can be taken into the information processing apparatus 1 from the headphones 3 and edited from that content. The user of the information processing apparatus 1 can draw a waveform (frequency characteristic) as an EQ parameter by touch panel operation or the like. The frequency characteristic shown by the thin line in FIG. 6A represents a state in which the waveform is being changed by touch panel operation (movement of a fingertip). The locus of the movement of the user's fingertip corresponds to the frequency characteristic of the audio signal processing desired by the user, and the setting closest to the one sought by the user is realized within the available resources (number of bands, upper limit, etc.). The waveform shown by the thick line in the figure is the finally determined frequency characteristic.
[0036]
FIG. 6B shows specific numerical values for each frequency of the waveform of FIG. 6A. x is the
coordinate of the horizontal axis when the left end is 0 on the screen. y is the coordinate of the
vertical axis directed downward when the upper end is 0 on the screen. Hz and dB are values
represented by converting the coordinates of x and y into frequency (Hz) and gain (dB). However,
this numerical value is an example and is not a numerical value corresponding to the waveform
of FIG. 6A.
[0037]
FIG. 6B lists the values of the frequency characteristic set by the thick line. The x value and the y value are the x-coordinate value and the y-coordinate value in the frequency characteristic drawing area of the display unit 36. The information processing apparatus 1 acquires, for example, the y-coordinate value as the gain value (dB) at each x-coordinate value (0, 5, 10, ...) of a fixed number of points (for example, 128 points) on the x coordinate. The y-coordinate value at each x-coordinate value is the coordinate value of the thick line in FIG. 6B, and this value is taken as the gain value (dB). Each point on the x coordinate corresponds to a frequency value (Hz); for example, frequency values in the range of 20 Hz to 22.05 kHz correspond to the points on the x coordinate.
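As a concrete illustration of this sampling, the following Python sketch maps the 128 sampled points and their y coordinates to (frequency, gain) pairs. The 20 Hz to 22.05 kHz range is taken from the text; the logarithmic point spacing, the ±12 dB display range and the height of the drawing area are illustrative assumptions, since the description does not specify them.

NUM_POINTS = 128
F_MIN, F_MAX = 20.0, 22050.0        # frequency range stated in the text
GAIN_MIN, GAIN_MAX = -12.0, 12.0    # assumed display range (dB)
DRAW_HEIGHT = 256                   # assumed height of the drawing area in pixels

def point_to_frequency(i):
    # Frequency (Hz) of sample point i, assuming logarithmic spacing.
    ratio = i / (NUM_POINTS - 1)
    return F_MIN * (F_MAX / F_MIN) ** ratio

def y_to_gain(y):
    # Gain (dB) for a y pixel coordinate (y = 0 at the top of the drawing area).
    ratio = y / (DRAW_HEIGHT - 1)
    return GAIN_MAX - ratio * (GAIN_MAX - GAIN_MIN)

def sample_waveform(y_coords):
    # Turn the 128 y coordinates of the thick-line waveform into (Hz, dB) pairs.
    return [(point_to_frequency(i), y_to_gain(y)) for i, y in enumerate(y_coords)]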
[0038]
Here, conventional imaging approaches will be mentioned. If the parameter information is intended for use by digital devices, it can be a standardized file such as XML, or it can be in a device-specific format. However, depending on the system, the types of files that can be uploaded are limited, which is inconvenient. A method using a QR code (registered trademark) is also conceivable. A QR code images the information so that it can be captured by a camera; the user can understand the import method at a glance by looking at the QR code, so it is easy to use and convenient. However, the amount of information it can hold is small. A method of embedding a parameter in a TAG area of an image file can also be considered, but the user cannot know the embedded information at all by the usual means (the act of viewing the image). It might be possible to indicate that the image is a setting file with some symbol or the like, but differences between settings would remain unknown. Moreover, this method cannot be realized at all in image formats that have no TAG area. When the frequency characteristic of FIG. 6 is to be handled, a method of directly storing the characteristic curve as an image may also be considered. However, such an image is hard to see when it is large and then reduced, and it is difficult for the user to recognize the features of the frequency characteristic at a glance. A method of inserting a watermark may also be considered, but this too does not allow the user to recognize differences in the set parameters at a glance.
[0039]
Therefore, in the present embodiment, the following imaging is performed. Typical image file formats are Microsoft Windows Bitmap Image (BMP), Joint Photographic Experts Group (JPEG), Graphics Interchange Format (GIF), and Portable Network Graphics (PNG). Here, PNG will be described as an example. The storage compression format may be any format other than PNG, as long as the original color information can be completely and correctly restored from the saved file. For imaging, a method that takes the gain value of each band of the target device as the target could also be adopted. However, when a setting file 2 for headphones equipped with a 31-band equalizer is read and the setting is to be reflected on headphones equipped with a 3-band equalizer, exception processing must be performed on the headphone side; that is, the image might change greatly depending on the resources of the headphones. Therefore, in the present technology, regarding the arrangement of the horizontal and vertical axes, the horizontal axis is sampled at fixed intervals (for example, the 128 points mentioned above), and the values on the vertical axis (gain values (dB)) are imaged.
[0040]
The imaging method and an imaging example of the embodiment will be specifically described with reference to FIG. 7. Imaging is realized in two stages. First, each gain value on the vertical axis is converted to color image data in accordance with its value; this is the converted image data generated by the image conversion unit 32. Next, image data including the converted image data (color image data) is saved as a file in a predetermined format (PNG format); this is the setting file image data 6 generated by the setting file image generation unit 33.
[0041]
First, the converted image data generated by the image conversion unit 32 will be described. The
image conversion unit 32 converts the numerical value itself included in the parameter
information into color information or transparency information to generate converted image
data. Specifically, the gain value is directly converted to color information. The color can be
expressed by combining R (red), G (green), B (blue), and α (transparency). Here, R (red), G
(green), B (blue), and α (transparency) are combined, converted to a predetermined color, and
imaged. Of course, the three primary colors R (red), G (green) and B (blue) may be combined.
[0042]
FIG. 7A shows the specific numerical values required to image the setting contents of FIG. 6. That is, FIG. 7A represents the gain value (dB) at each point on the horizontal axis of FIG. 6, in the form of a table, as the numerical values to be imaged. The numbers are in 32-bit floating-point notation; only 19 points are shown in the figure. The 32-bit value representing the gain value (dB) in FIG. 7A is decomposed into 8-bit pieces and allocated to R, G, B, and α, respectively. For example, the value of the first point is "C080AAAAh" ("h" indicates hexadecimal notation). In this case, C0h is allocated to the α value, 80h to the R value, AAh to the G value, and AAh to the B value. Since each value of R, G, B, and α is expressed by 8 bits, the maximum is 255 in decimal. This value corresponds to one pixel on the screen. That is, the color of one pixel correlated to the gain (dB) is determined.
[0043]
In this way, the 128 gain values are converted into color information, which forms 128 pixels of image data. FIG. 7B shows an example of converted image data 6a of 1 pixel × 128 pixels. Although the drawing is stretched vertically for illustration, the converted image data 6a is actually color information for one line. Although setting file image data 6 could be realized simply by storing the converted image data 6a shown in FIG. 7B in PNG format, the user cannot visually recognize an image of 1 pixel × 128 pixels. In addition, an application program that uses the setting file image data 6 cannot determine whether such an image file is a file for parameter setting.
[0044]
Therefore, as a second step, the setting file image generation unit 33 generates setting file image data 6 in which the converted image data 6a described above is arranged in image data of an image size larger than the converted image data 6a, so that the data can be viewed by the user as an image. The setting file image generation unit 33 also includes identification image data obtained by imaging identification information that identifies the setting file image data 6 as an image including converted image data.
[0045]
First, the identification image data that allows an application to recognize the setting file image data 6 will be described. Specifically, an identification tag is embedded in the setting file image data 6. In this case, the embedding is performed not in a text area or the like but in the image itself. The identification tag uses a universally unique identifier (UUID) or the like. The value of this identification tag is converted to RGBα and imaged, as in the color imaging described above. In this case, the transparency may be maximized so that the tag is hidden from view. FIG. 7C illustrates identification image data 6b obtained by imaging the identification tag (in the figure, it is enlarged and stretched vertically for the purpose of explanation). Further, in the present embodiment, the information imaged as the identification tag includes not only information indicating that the image is setting file image data 6 but also information indicating the target device (headphones 3, etc.). For example, information that can identify the device, such as the UUID of the target device, is placed as data in the identification tag.
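The following sketch images such an identification tag. The UUID value, the choice of packing three tag bytes per pixel into R, G and B, and the use of full transparency to hide the tag are illustrative assumptions consistent with the description above; a real application would define its own tag values and layout.

import uuid

# Hypothetical tag; an application and device vendor would define their own values.
FORMAT_TAG = uuid.UUID('0f0e0d0c-0b0a-0908-0706-050403020100')

def tag_to_pixels(tag, transparent=True):
    # Image a 128-bit identification tag as RGBA pixels: three tag bytes per pixel
    # in R, G, B, with alpha fully transparent so that the tag is hidden from view.
    data = tag.bytes + b'\x00' * (-len(tag.bytes) % 3)   # zero-pad to a multiple of 3
    alpha = 0 if transparent else 255
    return [(data[i], data[i + 1], data[i + 2], alpha) for i in range(0, len(data), 3)]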
[0046]
In order to enable the user to recognize the setting file image data 6 as an image, the setting file image generation unit 33 generates the setting file image data 6 including the converted image data 6a and the identification image data 6b. An example is shown in FIG. 7D. The setting file image data 6 is image data of an image size larger than the converted image data 6a. In the example of FIG. 7D, a square image is generated by extending the 128-pixel converted image data 6a vertically while applying gradation. For example, the converted image data 6a is arranged in the first row, and data obtained by gradually changing the α value of each of the 1 × 128 pixels of converted image data 6a is arranged in the second to 128th rows. This gives an image of 128 × 128 pixels. Further, the identification image data 6b is arranged at a predetermined position (shown by a broken line in the figure).
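The arrangement described for FIG. 7D can be sketched as follows using the Pillow imaging library, together with gains_to_line and tag_to_pixels from the earlier sketches. The linear fade used for the gradation, the placement of the identification pixels in the last row, and the output file name are illustrative choices rather than details taken from the description.

from PIL import Image  # Pillow

def build_setting_file_image(line_pixels, id_pixels, path='eq_setting.png'):
    # Arrange the 1 x 128 converted image data 6a in a 128 x 128 RGBA image:
    # row 0 holds the converted image data unchanged, rows 1..127 repeat it with
    # the alpha value gradually faded, and the identification image data 6b is
    # written into the last row.  PNG is lossless, so every value survives intact.
    size = len(line_pixels)                      # 128
    img = Image.new('RGBA', (size, size))
    for y in range(size):
        fade = 1.0 - y / (size - 1)              # 1.0 at the top row, 0.0 at the bottom
        for x, (r, g, b, a) in enumerate(line_pixels):
            img.putpixel((x, y), (r, g, b, int(a * fade)))
    for x, px in enumerate(id_pixels):           # identification image data 6b
        img.putpixel((x, size - 1), px)
    img.save(path, format='PNG')
    return img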
[0047]
As a result, the setting file image data 6 is image data in which the converted image data 6a, carrying the EQ parameter information, and the identification image data 6b, carrying the identification tag information, are arranged at predetermined positions within image data of an image size larger than the converted image data 6a. Such image data, stored in PNG format for example, becomes the setting file image data 6. The image represented by the setting file image data 6 is a substantially square image that the user can view. Although a square image is generated from the converted image data 6a in the above example, various other examples can be considered. In the example of FIG. 7D, a rendering effect is obtained by applying a transparency gradation in the vertical direction, but of course the 128 pixels of data may simply be copied to each row without gradation. In addition, the identification image data 6b may be placed at the bottom to make a gradation between two colors, upper and lower. Alternatively, the converted image data 6a and the identification image data 6b may be arranged in only a part of predetermined image data such as a square, without the rest of the image being derived from the converted image data 6a. In addition, the identification image data 6b may be converted into values of only R, G, and B, with the transparency α maximized so that it is hidden from the user's eyes. Of course, the image shape and size of the setting file image data 6 are not limited either, and various arrangement positions of the converted image data 6a and the identification image data 6b can be considered. In any case, it suffices that the application can find the converted image data 6a and the identification image data 6b in the setting file image data 6 and acquire the information.
[0048]
Even with setting file image data 6 as shown in FIG. 7D, the user can recognize it as a file in which parameter settings have been imaged and stored, but user recognition can be improved further. For example, a logo indicating that the image includes converted image data 6a, that is, indicating the function (service) that enables sharing of parameter information, may be attached as an image. FIG. 7E shows setting file image data 6d in which a mark "MY OWN" is arranged as such a logo; of course, this logo is only an example. By arranging a logo image in this manner, when the user views the image on the bulletin board 5 or the like, it is easy to recognize that this is setting file image data 6 that enables sharing of parameter information.
[0049]
FIG. 7F shows setting file image data 6 to which an image of a target apparatus is added. This
display enables the user to recognize the target device at a glance. By recognizing the target
device, the user can determine whether he / she can use the setting file image data 6 or not.
[0050]
FIG. 7G shows setting file image data 6 to which an image for presenting a parameter
information creator, such as a creator avatar or a photograph, is added. By doing this, the user
can easily recognize the creator of the setting file image data 6. Also, the creator can enjoy
creating the setting file image data 6, and the user can recognize the creator and determine
whether to use it.
[0051]
<4. Setting file image data generation example II> In the example shown in FIG. 7 as setting file image data generation example I, the image conversion unit 32 converts the numerical values themselves included in the parameter information into color information or transparency information to generate the converted image data 6a. In setting file image data generation example II, the image conversion unit 32 converts the numerical values included in the parameter information into color information or transparency information according to a conversion pattern to generate the converted image data. An example will be described with reference to FIG. 8 and FIG. 9.
[0052]
The converted image data 6a described with reference to FIG. 7 is image data in which the numerical value of the equalizer's set gain is distributed to the color information R, G, B, and α and converted to the color corresponding to the distributed numerical value. In this case, a larger numerical value simply yields a higher luminance of the corresponding color; for example, if the value of R is large, the brightness of red is high. In contrast, color conversion can also be performed by changing the values of R, G, B, and α according to a certain fixed relationship. In the following, a setting that changes the values of R, G, B, and α according to such a fixed relationship is called a conversion pattern.
[0053]
In the conversion pattern of FIG. 8A, the smaller the gain value, the higher the luminance of B (inverse relation) and the more the luminance of R is suppressed; the larger the gain value, the higher the luminance of R (proportional relation) and the more the luminance of B is suppressed. G is fixed regardless of the gain, but the magnitude of G can be selected (is variable) by the user. The setting file image data 6 shown in FIG. 8A is a display example in which the setting contents of the table in the lower part of the drawing are imaged with the B, R, and G conversion patterns described above.
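A sketch of this conversion pattern in code is shown below. Mapping the ±12 dB range linearly onto 0-255 and the default value chosen for the fixed G are illustrative assumptions; the essential point is only that R rises and B falls with the gain while G stays constant.

GAIN_MIN, GAIN_MAX = -12.0, 12.0    # settable gain range used in the figures

def pattern_a(gain_db, g_fixed=128):
    # Conversion pattern of FIG. 8A: R proportional to the gain, B inversely
    # related to it, G held at a user-selectable fixed value.
    t = (gain_db - GAIN_MIN) / (GAIN_MAX - GAIN_MIN)   # 0.0 at -12 dB .. 1.0 at +12 dB
    r = round(255 * t)
    b = round(255 * (1.0 - t))
    return (r, g_fixed, b, 255)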
[0054]
Since the figure is reproduced in black and white, the relationship between the setting contents and the display is unclear there, but it is roughly as follows. In the figure, the portions displayed light are portions with high red luminance, and the portions displayed dark are portions with high blue luminance. In this case, band 1 (the low-band side) has a gain of 2.4 dB; since the value is large, red is emphasized and the band appears light. Bands 2 and 3 have low gains (-2.4 dB, -4.8 dB), so blue is emphasized. Band 4 has a large gain (2.4 dB), so red is emphasized and it appears light. Band 5 has a low gain (-9.6 dB), so blue is emphasized and it appears dark. Bands 6 and 7 have large gains (7.2 dB, 12 dB), so red is emphasized and they appear light. Band 8 has a low gain (-4.8 dB), so blue is emphasized and it appears dark. Bands 9 and 10 (the high-band side) have large gains (4.8 dB, 7.2 dB), so red is emphasized and they appear light. In this way, the features of the frequency characteristic can be flexibly expressed using color differences; in an actual color display, the differences can be expressed even more finely. If the user knows the tendency of the color tones given by the conversion pattern, the tendency of the available EQ parameter settings can be roughly grasped just by looking at such setting file image data 6. For example, frequency characteristics such as high-frequency emphasis, low-frequency emphasis, or generally flat can be recognized.
[0055]
In the conversion pattern of FIG. 8B, when the gain is smaller than zero, R is set to zero, and the smaller the gain, the higher the luminance of B (inverse relation). When the gain is larger than zero, B is set to zero, and the larger the gain, the higher the luminance of R (proportional relation). As for G, its luminance is suppressed as the gain moves away from zero in either direction. Further, R (or B) and G add up to a constant value (for example, 255). The magnitude of G when the gain is zero can be selected (is variable) by the user.
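This pattern can be sketched in the same style. The linear scaling over ±12 dB is again an assumption; the constant sum of the dominant colour and G is realised here by letting them share the user-selected value of G at zero gain.

def pattern_b(gain_db, g_at_zero=255, gain_max=12.0):
    # Conversion pattern of FIG. 8B: below 0 dB only B and G are used, above 0 dB
    # only R and G; the dominant colour grows with |gain| while G shrinks so that
    # their sum stays constant at g_at_zero.
    t = min(abs(gain_db) / gain_max, 1.0)   # 0.0 at 0 dB, 1.0 at +/-12 dB
    strong = round(g_at_zero * t)
    g = g_at_zero - strong
    if gain_db >= 0:
        return (strong, g, 0, 255)          # red side
    return (0, g, strong, 255)              # blue side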
[0056]
In FIG. 8B as well, the portions displayed light are portions in which red is emphasized, and the portions displayed dark are portions in which blue is emphasized, as in FIG. 8A. The portions of intermediate density between those of R and B are portions in which green is emphasized. With the conversion patterns of R, B, and G described above, the relationship between the setting contents and the display is roughly as follows. G is emphasized when the gain is close to zero. When the gain is smaller than zero, R is zero, so the color is a mixture of B and G, and as the gain decreases, B is emphasized. When the gain is larger than zero, B is zero, so the color is a mixture of R and G, and as the gain increases, R is emphasized.
[0057]
The gains of bands 1 (the low-band side), 2, 3, and 4 are close to zero, so G is emphasized and they are displayed at an intermediate density between R and B. The gain of band 5 (-9.6 dB) is small, so B is emphasized and it appears dark. The gains of bands 6 and 7 (7.2 dB, 12 dB) are large, so R is emphasized and they appear light. Since the gain of band 8 (-4.8 dB) is lower than the gains of bands 1 to 4, blue is emphasized and it appears a little darker. Since the gains of bands 9 and 10 (4.8 dB and 7.2 dB) are smaller than those of bands 6 and 7, R is emphasized, but they appear less light than bands 6 and 7 and slightly darker.
[0058]
Similar to FIG. 8A, the characteristics of the frequency characteristic can be flexibly expressed
using color differences. In fact, since it is a color display, it can express more detailed differences.
As a result, the user who views the image of the setting file image data 6 can know the tendency
of the setting contents of the EQ parameter.
[0059]
FIG. 9 shows other examples of the conversion pattern and of the arrangement of the converted image data 6a. In FIG. 8, the display from the low band to the high band of the equalizer, as the converted image data 6a, is arranged along the horizontal axis, whereas FIG. 9 uses different arrangements. In the example of FIG. 9A, the central portion of the display corresponds to the low band, and the setting content is displayed in color radially, toward the high band along the diagonals. The color display is produced by combining R, G, B, and α. The equalizer setting is shown as an example with 10 bands, as in FIG. 8. The setting values are as shown in the table on the right of the screen and are the same as in FIG. 8. Likewise, the gain (dB) that can be set in the equalizer ranges from -12 dB to +12 dB.
[0060]
In the conversion pattern of FIG. 9A, R is fixed at the maximum value of its variable range, B and G are zero, and α becomes larger as the set gain becomes smaller (inverse relation). The resulting image is a display in which α (transparency) changes in accordance with the set gain of the equalizer, on a red base. Since FIG. 9A is reproduced in black and white, the parts displayed dark correspond to R (red), and the parts displayed light are parts where α (transparency) is large. Therefore, where the equalizer setting value is small the display is light, and where it is large the display is dark. The relationship between the setting contents and the display in this case is roughly as follows. Since the center of the screen is the low-frequency side, band 1 corresponds to the center. Bands 2 to 10 then correspond to positions along the diagonals, with the corners corresponding to band 10.
[0061]
The conversion pattern of FIG. 9B is the same as that of FIG. 9A for R and α, but G is increased from band 1 toward band n along the horizontal axis; that is, G increases from the low-band side toward the high-band side. The resulting image is a display in which, compared with the example of FIG. 9A, G becomes stronger toward the higher frequencies. FIG. 9C is an example in which the conversion pattern is the same as in FIG. 9B but the converted image data 6a is arranged radially in the form of a square.
[0062]
As described above, by treating the color information R, B, G, and α as variables and setting them in various ways, the impression given by the setting file image can be changed. This makes it possible to express the individuality of whoever creates the setting file image data 6. For example, the creator (or the target device) or the type of EQ parameter (a parameter for jazz, a parameter for pop, etc.) can be expressed by the conversion pattern or by the arrangement of the converted image data 6a. The user who creates the setting file image data 6 can also select one of the above conversion patterns or download a conversion pattern itself.
[0063]
<5. Initial Communication with Target Device> Hereinafter, processing examples of the information processing device 1 according to the embodiment will be described. First, initial communication with a target device such as the headphones 3 will be described with reference to FIG. 10. FIG. 10 shows a process in which the information processing device 1 receives predetermined information from a target product (headphones or the like) under the control of an application program started by the control unit 35 in the information processing device 1.
[0064]
At step S1, the control unit 35 inquires of the headphone 3 about the model information via the
communication unit 37. In step S2, under the control of the control unit 35, model information
(for example, UUID) sent from the headphones 3 is received via the communication unit 37. In
step S3, the control unit 35 analyzes the received model information. In step S4, the control unit
35 performs drawing setting corresponding to the equalizer configuration (setting of the number
of bands and the like).
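The exchange of steps S1 to S4 can be sketched as follows. The transport and drawing_ui objects and their methods are hypothetical stand-ins for the communication unit 37 and the display-side drawing setup; they do not represent a real device API, and the reply format is assumed.

import uuid

def initial_communication(transport, drawing_ui):
    # S1: ask the headphones for their model information.
    transport.send({'request': 'model_info'})
    # S2: receive the reply, e.g. {'uuid': '...', 'bands': 10}  (format assumed).
    reply = transport.receive()
    # S3: analyse the received model information.
    model_uuid = uuid.UUID(reply['uuid'])
    num_bands = int(reply.get('bands', 10))
    # S4: drawing setting corresponding to the equalizer configuration.
    drawing_ui.configure(bands=num_bands)
    return model_uuid, num_bands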
[0065]
By performing such communication, the information processing apparatus 1 can, when setting file image data 6 is to be created, read the current EQ parameters of the headphones and display them on the display unit 36 for editing. In addition, when the information processing apparatus 1 attempts to download and use setting file image data 6, it can check whether the communication partner is a target model that can use the parameter information of that setting file image data 6.
[0066]
<6. Setting File Image Data Generation Processing> Next, an example of processing for
generating setting file image data 6 will be described with reference to FIGS. 11A and 11B. FIG.
11A corresponds to the setting file image data generation example I described above, and FIG.
11B corresponds to the setting file image data generation example II.
[0067]
First, the process of FIG. 11A will be described. In step S11, a file name is input under the control
of the control unit 35. In this case, the file name can be automatically input, or can be manually
input by the operation of the operation unit 34. When the file name is input by either automatic
input or manual input, the process of the control unit 35 proceeds to step S12. In step S12,
under the control of the control unit 35, the display unit 36 executes graphic display of the
characteristics of the equalizer. For example, the EQ parameters read from the headphone or the
like by communication as shown in FIG. 10 are displayed for editing, or an image for creating a
new EQ parameter is displayed. In response to this, the user inputs an operation and creates and
edits an EQ parameter desired. When the newly created or edited parameter information as a
source of generating the setting file image data 6 is decided, it is taken into the parameter input
unit 31 of the signal processing unit 50 in step S13.
[0068]
In step S14, the signal processing unit 50 generates the converted image data 6a under the control of the control unit 35. That is, for the created parameter information, the image conversion unit 32 of the signal processing unit 50 distributes, for example, the 32-bit gain value for each frequency into α, R, G, and B values as described above to generate the converted image data 6a. In steps S15 to S18, the signal processing unit 50 generates the setting file image data 6 by the function of the setting file image generation unit 33. In step S15, the signal processing unit 50 arranges the converted image data 6a in the image that will become the setting file image data 6; for example, as in FIG. 7, the converted image data 6a is included in part of the square setting file image data 6. In step S16, the signal processing unit 50 arranges the identification image data 6b in the image data. In step S17, a logo mark, a target device image, an avatar or the like is arranged in the image as a visual recognition symbol. Then, in step S18, the signal processing unit 50 compresses the image data generated by the above processing in PNG format.
[0069]
In step S19, the control unit 35 stores the setting file image data 6 generated by the signal
processing unit 50 as described above in the storage unit 38. The setting file image data 6 is
generated by the above procedure.
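Leaving aside the file naming, editing UI and logo placement of steps S11, S12 and S17, the remaining steps can be composed from the earlier sketches roughly as follows; gains_to_line, tag_to_pixels and build_setting_file_image are the assumed helpers defined above, not functions disclosed in the patent.

def generate_setting_file_image(eq_gains_db, model_tag, path):
    # S13/S14: take the decided EQ gains and image them as converted image data 6a.
    line = gains_to_line(eq_gains_db)
    # S16: image the identification tag (format / target model) as data 6b.
    ident = tag_to_pixels(model_tag)
    # S15, S18, S19: arrange both in the larger image and store it as a PNG file.
    return build_setting_file_image(line, ident, path)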
[0070]
Next, FIG. 11B will be described. FIG. 11B differs from FIG. 11A only in that step S21 is added. In step S21, the conversion pattern to be used is determined. The conversion pattern may be determined by user selection, or may be selected according to the music genre, the target device, and the like. Alternatively, a fixed conversion pattern may always be used. Steps S22 to S30 are the same as steps S11 to S19 in FIG. 11A. However, in the generation of the converted image data 6a in step S25, the conversion pattern determined in step S21 is used, and imaging as described with reference to FIGS. 8 and 9 is performed.
[0071]
<7. Process when using setting file image data> Next, processing in the information processing apparatus 1 according to the embodiment when using the setting file image data 6 will be described with reference to the flowchart described below. The use here is the operation of taking setting file image data 6 downloaded to the information processing apparatus 1 from the bulletin board 5 on the SNS and setting the EQ parameter contained in it to the external headphones 3. The processing for this is realized by causing the control unit 35 in the information processing apparatus 1 to execute an application program.
[0072]
FIG. 11 is a flowchart showing the procedure by which the information processing apparatus 1 analyzes the downloaded setting file image data 6 and sets the equalizer setting contents of the setting file image data 6 in the target product (headphones 3, etc.). As a premise of FIG. 11, it is assumed that the information processing apparatus 1 has downloaded, for example, setting file image data 6 presented on the SNS by network communication via the communication unit 37 and stored the image file as setting file image data 6 in the storage unit 38.
[0073]
In step S101, the control unit 35 opens and reads the image file (setting file image data 6) downloaded and stored in the storage unit 38. In step S102, the control unit 35 determines whether the image file is an image file serving as setting file image data 6. This is processing to confirm whether the identification image data 6b is included in the file and whether its data content is appropriate.
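Assuming, purely for illustration, that the identification image data 6b is a short byte signature encoded in pixels at a known fixed position, the check of step S102 might look like the following sketch; the signature value and its location are hypothetical and not taken from the embodiment.

# Minimal sketch of steps S101-S102 under the stated assumptions.
from PIL import Image

ID_SIGNATURE = b"EQIMG1"   # hypothetical identification bytes

def is_setting_file_image(path):
    """Return True if the downloaded image appears to carry the identification data."""
    img = Image.open(path).convert("RGBA")
    head = img.crop((0, 0, 2, 1)).tobytes()      # first two RGBA pixels = 8 bytes
    return head.startswith(ID_SIGNATURE)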
[0074]
If the image file to be processed is not the setting file image data 6, the process proceeds to step S110, a warning message is displayed on the display unit 36, and the process ends. When the image file to be processed is confirmed to be the setting file image data 6, the process proceeds to step S103, and the control unit 35 confirms the target model (the headphones 3 or the like) from the information of the identification image data 6b.
[0075]
In step S104, the control unit 35 determines whether or not the model of the headphone 3 connected in communication (or connectable) is the same as the target model confirmed in step S103. The model of the connected headphone 3 is acquired by the processing of steps S1 and S2 in the initial communication of FIG. Therefore, the processing here checks whether the target model confirmed in step S103 matches the model information acquired in the processing of steps S1 and S2 of FIG. If the model of the connected headphone 3 and the corresponding model of the setting file image data 6 do not match, the process proceeds to step S108, where a warning message is displayed on the display unit 36, and then to step S109. In step S109, the user is notified that the device currently able to communicate is not the model corresponding to the setting file image data 6, and is requested to make a selection. This is because EQ parameters and the like may still be usable even if the model does not correspond exactly. If the user selects to continue the process, the process proceeds to step S105; if not, the process ends.
[0076]
In step S105, the control unit 35 reads the converted image data 6a from the setting file image data 6 and causes the signal processing unit 50 to decode it. That is, the setting file image data 6 is taken into the signal processing unit 50 from the storage unit 38 by the function of the setting file image acquisition unit 41 of FIG. 5B. Then, the converted image data 6a is extracted from the setting file image data 6 by the function of the converted image extraction unit 42. Further, the parameter decoding unit 43 generates EQ parameters from the converted image data 6a. That is, the decoding process corresponding to the imaging (encoding) performed by the image conversion unit 32 of FIG. 5A is executed. Specifically, the α, R, G, and B values of the 1 × 128 pixels are returned to gain values, which are used as the gain values of 128 points on the frequency axis. As a result, EQ parameter information of the equalizing characteristic with the original 128-point gain values can be obtained.
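As a counterpart to the earlier encoding sketch, the following Python example shows one way the decoding of step S105 could be realized: the 1 × 128 strip is read back out of the setting file image, and each RGBA pixel is converted into a gain value again. The strip position, byte order, and fixed-point scale are the same illustrative assumptions as before, not values defined by the embodiment.

# Rough inverse of the earlier encoding sketch (step S105).
from PIL import Image
import struct

def strip_to_gains(path, strip_pos=(32, 480), n_bands=128, scale=1000):
    """Read the 1x128 strip and turn each RGBA pixel back into a gain value (dB)."""
    img = Image.open(path).convert("RGBA")
    x0, y0 = strip_pos
    gains = []
    for x in range(n_bands):
        r, g, b, a = img.getpixel((x0 + x, y0))
        value = struct.unpack(">i", bytes((r, g, b, a)))[0]
        gains.append(value / scale)          # back to dB
    return gains                             # 128 gain points on the frequency axis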
[0077]
In step S106, the control unit 35 further causes the signal processing unit 50 to convert the decoded characteristic into a frequency characteristic matching the resources of the target device (the headphones) to be communicated with. For example, although the decoded EQ parameters constitute frequency characteristic information of 128 bands, if the equalizer of the headphone 3 has a 6-band configuration, they are converted into 6-band EQ parameters. The control unit 35 causes the signal processing unit 50 to execute this processing to reproduce the EQ parameters for the headphone 3, and stores the resulting EQ parameters in the storage unit 38. Then, in step S107, the communication unit 37 transmits the stored EQ parameters to the headphones 3. In response, on the headphone 3 side, as shown in step S111, the equalizer setting is changed to the new EQ parameters. According to the above procedure, the parameter settings of a target device such as the headphone 3 can be changed using the setting file image data 6 downloaded to the information processing apparatus 1. Through such processing, as described above, the user can change the settings of his or her own headphones 3 using setting file image data 6 that has been made publicly available.
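One possible realization of the band reduction in step S106 is sketched below: the 128-point characteristic is collapsed to the band count supported by the connected device (six bands here) by picking, for each target band, the source point closest in log frequency. The target center frequencies and the assumption of logarithmically spaced source points are illustrative only and are not taken from the embodiment.

# Sketch of step S106: 128-point characteristic -> 6-band EQ parameters.
import math

def resample_eq(gains_128, target_centres_hz=(60, 150, 400, 1000, 2400, 15000),
                fmin=20.0, fmax=20000.0):
    n = len(gains_128)
    # Assume the 128 points are spaced logarithmically between fmin and fmax.
    freqs = [fmin * (fmax / fmin) ** (i / (n - 1)) for i in range(n)]
    out = []
    for fc in target_centres_hz:
        # Pick the source point closest (in log frequency) to each target band.
        idx = min(range(n), key=lambda i: abs(math.log(freqs[i]) - math.log(fc)))
        out.append(gains_128[idx])
    return out   # 6-band EQ parameters to be sent to the headphones in step S107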
[0078]
<8. Program and Modification> The program according to the embodiment is a program that causes an arithmetic processing unit such as a CPU or DSP to execute the processing of the information processing apparatus 1 described in the above embodiment. That is, the program for executing the process of generating the setting file image data 6 causes the information processing apparatus to execute a process of inputting parameter information for setting an operation state of a target device, a process of generating converted image data 6a obtained by imaging the parameter information, and a process of generating setting file image data 6 in which the converted image data 6a is arranged in image data having a larger image size than the converted image data 6a. Specifically, this program may be a program that causes the processing unit to execute the processing shown in FIG. 11A or 11B. The program for executing the process of using the setting file image data 6 causes the information processing apparatus to execute a process of receiving setting file image data 6, a process of extracting converted image data 6a from the setting file image data 6, and a process of generating parameter information from the converted image data 6a. Specifically, this program may be a program that causes the processing unit to execute the processing shown in FIG.
[0079]
With these programs, the information processing apparatus 1 described above can be realized using an arithmetic processing unit. Such a program can be recorded in advance in an HDD as a recording medium incorporated in an apparatus such as a computer device, in a ROM in a microcomputer having a CPU, or the like. Alternatively, it can be stored (recorded) temporarily or permanently in a removable recording medium such as a flexible disc, a compact disc read only memory (CD-ROM), a magneto-optical disc (MO), a digital versatile disc (DVD), a Blu-ray disc (registered trademark), a magnetic disc, a semiconductor memory, or a memory card. Such removable recording media can be provided as so-called package software. The program may be installed from a removable recording medium to a personal computer or the like, and may also be downloaded from a download site via a network such as a LAN (Local Area Network) or the Internet.
[0080]
Moreover, these programs are suitable for widely providing the information processing apparatus 1 according to the embodiment. For example, by downloading the program to a personal computer, a portable information processing apparatus, a portable telephone, a game machine, a video apparatus, a PDA (Personal Digital Assistant), or the like, such a device can be made to function as the information processing apparatus 1.
[0081]
Although the embodiment has been described above, various modifications can be considered. The parameters have been described using the example of the equalizer settings (EQ parameters) of the headphone 3, but various other parameters can be considered. For example, in the case of headphones, system components, AV receivers, car audio, and other audio equipment, examples include reverb setting parameters, echo setting parameters, flanger effect parameters, phase shift effect parameters, panpot setting parameters, multichannel gain setting parameters, virtual sound space setting parameters, setting parameters for bass emphasis and wide-range interpolation, noise cancellation filter parameters, DJ effect parameters, and the like. In the case of noise canceling headphones and the like, the present technology is suitable for sharing parameters that are optimal under specific conditions, such as noise canceling filters optimized for trains on a certain railway route, or noise canceling filters optimized for a particular car model in a car audio system. Of course, setting parameters for video signals (such as an image quality mode) may also be used. Therefore, the present technology can be applied to the setting of specific functions of various types of audio, video, and entertainment equipment.
[0082]
The present technology can also adopt the following configurations.
(1) An information processing apparatus including: a parameter input unit for inputting parameter information for setting an operation state of a target device; an image conversion unit for generating converted image data obtained by imaging the parameter information; and a setting file image generation unit for generating setting file image data in which the converted image data is arranged in image data having a larger image size than the converted image data.
(2) The information processing apparatus according to (1), wherein the setting file image data further includes identification image data obtained by imaging identification information for identifying the image as one including the converted image data.
(3) The information processing apparatus according to (1) or (2), wherein the image conversion unit converts the parameter information into color information or transparency information to generate the converted image data.
(4) The information processing apparatus according to (3), wherein the image conversion unit converts a numerical value itself included in the parameter information into color information or transparency information to generate the converted image data.
(5) The information processing apparatus according to (3), wherein the image conversion unit converts a numerical value included in the parameter information into color information or transparency information according to a conversion pattern to generate the converted image data.
(6) The information processing apparatus according to (5), wherein the image conversion unit uses a selected or input conversion pattern.
(7) The information processing apparatus according to any one of (1) to (6), wherein the setting file image data includes image data corresponding to the converted image data.
(8) The information processing apparatus according to any one of (1) to (7), wherein the setting file image data includes an image presenting that the image includes the converted image data.
(9) The information processing apparatus according to any one of (1) to (8), wherein the setting file image data includes an image presenting the creator of the parameter information.
(10) The information processing apparatus according to any one of (1) to (9), wherein the setting file image data includes an image presenting the target device.
(11) The information processing apparatus according to any one of (1) to (10), wherein the parameter information is a parameter of an equalizer characteristic.
(12) An information processing method including: inputting parameter information for setting an operation state of a target device; generating converted image data obtained by imaging the parameter information; and generating setting file image data in which the converted image data is arranged in image data having a larger image size than the converted image data.
(13) A program for causing an information processing apparatus to execute: a process of inputting parameter information for setting an operation state of a target device; a process of generating converted image data obtained by imaging the parameter information; and a process of generating setting file image data in which the converted image data is arranged in image data having a larger image size than the converted image data.
(14) An information processing apparatus including: a setting file image acquisition unit for receiving setting file image data in which converted image data, obtained by imaging parameter information for setting an operation state of a target device, is arranged in image data having a larger image size than the converted image data; a converted image extraction unit for extracting the converted image data from the setting file image data; and a parameter decoding unit for generating parameter information from the converted image data.
(15) An information processing method including: receiving setting file image data in which converted image data, obtained by imaging parameter information for setting an operation state of a target device, is arranged in image data having a larger image size than the converted image data; extracting the converted image data from the setting file image data; and generating parameter information from the converted image data.
(16) A program for causing an information processing apparatus to execute: a process of receiving setting file image data in which converted image data, obtained by imaging parameter information for setting an operation state of a target device, is arranged in image data having a larger image size than the converted image data; a process of extracting the converted image data from the setting file image data; and a process of generating parameter information from the converted image data.
[0083]
DESCRIPTION OF SYMBOLS 1 ... Information processing apparatus, 3 ... Headphones, 31 ... Parameter input unit, 32 ... Image conversion unit, 33 ... Setting file image generation unit, 34 ... Operation unit, 35 ... Control unit, 36 ... Display unit, 37 ... Communication unit, 38 ... Storage unit, 41 ... Setting file image acquisition unit, 42 ... Converted image extraction unit, 43 ... Parameter decoding unit