Smart-Toy and Smart-Bedsheet Dataset Based on Pressure Mapping Smart Textile

Published: 17 March 2022| Version 1 | DOI: 10.17632/7cvh5wmx8p.1
Contributors:
Description

For details about the two datasets, please read "introduction_to_the_two_datasets.pdf". Based on the pressure mapping smart textile, we have developed two applications: Smart-Toy and Smart-Bedsheet, as shown in Fig. 1 and Fig. 3. For the Smart-Toy, we were interested in common everyday interactions with a plush toy (20 actions in total, as shown in Fig. 2) and invited 10 participants (5 females and 5 males, aged 23∼35) to create the Smart-Toy dataset; its details are presented in Section 2. For the Smart-Bedsheet, we used it to identify common sleeping postures (11 postures in total, as shown in Fig. 4) and invited 13 participants (4 females and 9 males, weights from 42 kg to 110 kg, heights from 155 cm to 188 cm) to create the Smart-Bedsheet dataset; its details are introduced in Section 3. You are welcome to visit our website: http://pplab.ustc.edu.cn/. If you have any questions about the two datasets, please feel free to contact us (gtpplab@mail.ustc.edu.cn).

Funding

Our work is supported by the National Natural Science Foundation of China (Grant No. 62072420) and the Fundamental Research Funds for the Central Universities (Grant No. 2150110020).

1. Smart-Toy dataset

The Smart-Toy is a 3D application of the pressure mapping smart textile. The majority of the toy's surface is covered with a 23 × 16 textile sensor matrix with a spatial resolution of 1.5 cm × 1.5 cm, as shown in Fig. 1b. The pressure-sensitive textile is sewn onto the inner side of the plush toy's skin, flipped, and folded together with the skin into a cuboid. The pressure distribution is digitized by on-chip 12-bit ADCs in a dsPIC33F microcontroller and transmitted wirelessly via Bluetooth to a smartphone at 41 frames/second. Ten healthy adults (5 females and 5 males, aged 23∼35) were invited to create the play-interaction dataset. They followed instructions from the computer and carried out the 20 play interactions (shown in Fig. 2) in a random order (one session). Each participant finished 10 sessions. We manually labeled each instance by observing the video recordings and the pressure-distribution changes.

2. Smart-Bedsheet dataset

The Smart-Bedsheet is covered with a 56 × 40 textile sensor matrix with a spatial resolution of 3.2 cm × 2.3 cm, as shown in Fig. 3. The pressure distribution is digitized by on-chip 12-bit ADCs in an STM32F303ZET6 microcontroller and transmitted via USB to a laptop at 55 frames/second. Thirteen healthy adults (4 females and 9 males, weights from 42 kg to 110 kg, heights from 155 cm to 188 cm) were invited. They followed instructions from the organizer and completed the 11 sleeping postures (shown in Fig. 4) in a random order; each posture lasted about 5 seconds. Each participant finished 10 sessions. Because the frames within a 5-second posture are extremely similar, we take a single random frame from each posture's interval as the instance of that posture.
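The random-frame rule above can be sketched in a few lines of Python. This is an illustrative helper, not the authors' code: the function name `posture_instance` and the assumption that the end-frame index is inclusive are ours.

```python
import numpy as np

rng = np.random.default_rng(0)  # seeded for reproducibility

def posture_instance(frames, start, end):
    """Pick one random frame from a labeled posture interval.

    frames: array of shape (n_frames, 56, 40) -- Smart-Bedsheet frames
    start, end: start/end frame indices from the label file
               (assumed inclusive here)
    """
    idx = rng.integers(start, end + 1)  # random index within [start, end]
    return frames[idx]
```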

Steps to reproduce

For details about the two datasets, please read "introduction_to_the_two_datasets.pdf".

1. Smart-Toy dataset

1.1. Data files

The Smart-Toy dataset contains 10 participants ("a" to "j"). For each participant there is a file named "data_stream.npy", a matrix saved directly to disk with the "numpy.save()" function. Its size is (n_rows=23, n_cols=16, n_frames): the first two dimensions are the number of rows and columns of the sensor matrix, and the last dimension is the total number of frames. About the labels:
1) label.csv: generated by manually marking the start and end frames of each interaction in the data stream. Each label file corresponds to a data file. Each line represents one mark: the name of the interaction, the index of the start frame, and the index of the end frame.
2) win_label.csv: generated by manually marking the start and end frames of each interaction in the data stream and then automatically dividing each interaction according to the window size (20 frames). Each label file corresponds to a data file. Each line represents one mark: the name of the interaction, the index of the start frame, and the index of the end frame.

1.2. Example code

The file "example_code_smart_toy.py" in this directory shows how to read the data and labels of the Smart-Toy dataset.

2. Smart-Bedsheet dataset

2.1. Data files

The Smart-Bedsheet dataset contains 13 participants (numbered from 1 to 13). Each participant completed the 11 sleeping postures (as shown in Fig. 4) in a random order and finished 10 sessions. The data of session k are stored in the file "dataset_k.csv", and the labels are stored in the file "label_k.csv":
1) dataset_k.csv: each row represents one frame and has 2242 numbers. The first number is the timestamp of the lower computer (counted from its start-up, in units of 0.1 ms), and the second is the timestamp of the upper computer (counted from 1970/1/1, in units of 1 ms). The remaining 2240 numbers (56 rows × 40 columns) are the bedsheet data, flattened into one row in row-first order.
2) label_k.csv: each line represents one mark: the name of the posture, the index of the start frame, and the index of the end frame.

2.2. Example code

The file "example_code_smart_bedsheet.py" in this directory shows how to read the data and labels of the Smart-Bedsheet dataset.
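For readers who do not want to open the bundled example scripts first, the file layouts described above can be loaded with plain `numpy` and `csv` along these lines. This is an independent sketch, not a copy of "example_code_smart_toy.py" or "example_code_smart_bedsheet.py"; the helper names and the assumption that end-frame indices are inclusive are ours.

```python
import csv
import numpy as np

def load_toy(data_path, label_path):
    """Load one Smart-Toy recording and slice out labeled interactions."""
    stream = np.load(data_path)  # shape (n_rows=23, n_cols=16, n_frames)
    clips = []
    with open(label_path, newline="") as f:
        for name, start, end in csv.reader(f):
            # frames of one interaction; end index assumed inclusive
            clips.append((name, stream[:, :, int(start):int(end) + 1]))
    return clips

def load_bedsheet(csv_path):
    """Load one Smart-Bedsheet session: timestamps plus 56x40 frames."""
    stamps, frames = [], []
    with open(csv_path, newline="") as f:
        for row in csv.reader(f):
            # lower-computer stamp (0.1 ms units), upper-computer stamp (ms)
            stamps.append((int(row[0]), int(row[1])))
            # remaining 2240 values, row-first order -> 56 x 40 matrix
            frames.append(np.array(row[2:], dtype=float).reshape(56, 40))
    return stamps, np.stack(frames)
```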

Institutions

University of Science and Technology of China

Categories

Activity Recognition, Pervasive Computing
