Automatic Detection of Chewing and Swallowing.
Akihiro Nakamura, Takato Saito, Daizo Ikeda, Ken Ohta, Hiroshi Mineno, Masafumi Nishimura
Published in: Sensors (Basel, Switzerland) (2021)
A series of eating behaviors, including chewing and swallowing, is considered crucial to maintaining good health. However, most of these behaviors occur inside the human body, so highly invasive methods such as X-rays and fiberscopes must be used to collect accurate behavioral data. A simpler measurement method is needed in the healthcare and medical fields; hence, the present study develops a method to automatically recognize a series of eating behaviors from the sounds produced during eating. The automatic detection of left chewing, right chewing, front biting, and swallowing was tested with a hybrid CTC/attention model, which is trained on sounds recorded by two-channel (2ch) microphones placed under the ears together with weakly labeled data, to detect the balance of chewing and swallowing. N-gram-based data augmentation was first performed on the weakly labeled data to generate a large number of weakly labeled eating sounds and thereby augment the training data. Detection performance was improved by the hybrid CTC/attention model, which can learn the context of label sequences. In addition, the study confirmed similar detection performance for open foods (not included in the training data) and closed foods (included in it).
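The n-gram-based augmentation described in the abstract can be pictured as follows: estimate transition statistics over event labels from the weakly labeled sequences, sample synthetic label sequences from that model, and render each one into audio. The sketch below, in Python, shows a bigram variant; the label names and the audio-concatenation step are illustrative assumptions, not details from the paper.

```python
import random
from collections import defaultdict

# Hypothetical event labels; the paper's actual label inventory may differ.
EVENTS = ["left_chew", "right_chew", "front_bite", "swallow"]

def train_bigram(sequences):
    """Estimate bigram transition probabilities from weakly labeled
    event sequences (each a list of label strings)."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        prev = "<s>"
        for label in seq + ["</s>"]:
            counts[prev][label] += 1
            prev = label
    return {p: {l: c / sum(nxt.values()) for l, c in nxt.items()}
            for p, nxt in counts.items()}

def sample_sequence(bigram, max_len=30):
    """Sample one synthetic event sequence from the bigram model."""
    seq, prev = [], "<s>"
    while len(seq) < max_len:
        labels, probs = zip(*bigram[prev].items())
        nxt = random.choices(labels, probs)[0]
        if nxt == "</s>":
            break
        seq.append(nxt)
        prev = nxt
    return seq

# Each sampled label sequence would then be turned into a training clip by
# concatenating recorded single-event sound segments for each label.
```

Sampling many such sequences yields weakly labeled training pairs whose label statistics mimic real eating sessions, which is the point of the augmentation step.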
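For readers unfamiliar with the hybrid CTC/attention objective (Watanabe et al.), here is a minimal PyTorch sketch of how the two branches share an encoder and interpolate their losses. The layer sizes, label set, `ctc_weight`, and the simplified single-step attention decoder are assumptions for illustration, not the paper's actual architecture.

```python
import torch
import torch.nn as nn

class HybridCTCAttention(nn.Module):
    """BiLSTM encoder shared by a CTC branch and an attention branch;
    the two losses are interpolated during training."""
    def __init__(self, feat_dim=80, hidden=256, num_labels=7, ctc_weight=0.3):
        super().__init__()
        self.ctc_weight = ctc_weight
        self.encoder = nn.LSTM(feat_dim, hidden, num_layers=3,
                               batch_first=True, bidirectional=True)
        self.ctc_proj = nn.Linear(2 * hidden, num_labels)      # CTC branch
        self.embed = nn.Embedding(num_labels, 2 * hidden)
        self.attn = nn.MultiheadAttention(2 * hidden, num_heads=4,
                                          batch_first=True)
        self.dec_proj = nn.Linear(2 * hidden, num_labels)      # attention branch
        self.ctc_loss = nn.CTCLoss(blank=0, zero_infinity=True)
        self.ce_loss = nn.CrossEntropyLoss()                   # padding handling omitted

    def forward(self, feats, feat_lens, labels, label_lens):
        enc, _ = self.encoder(feats)                           # (B, T, 2H)

        # CTC branch: frame-wise label posteriors over the whole clip.
        ctc_logp = self.ctc_proj(enc).log_softmax(-1)          # (B, T, V)
        loss_ctc = self.ctc_loss(ctc_logp.transpose(0, 1),
                                 labels, feat_lens, label_lens)

        # Attention branch: each target token attends over encoder states.
        # For brevity each token queries the encoder directly; a real decoder
        # would be autoregressive with shifted inputs.
        q = self.embed(labels)                                 # (B, L, 2H)
        ctx, _ = self.attn(q, enc, enc)
        dec_logits = self.dec_proj(ctx)                        # (B, L, V)
        loss_att = self.ce_loss(dec_logits.reshape(-1, dec_logits.size(-1)),
                                labels.reshape(-1))

        return self.ctc_weight * loss_ctc + (1 - self.ctc_weight) * loss_att

# Example shapes: 2 clips, 100 frames of 80-dim features, 5 events per clip.
model = HybridCTCAttention()
feats = torch.randn(2, 100, 80)
labels = torch.randint(1, 7, (2, 5))       # label 0 is reserved for CTC blank
loss = model(feats, torch.full((2,), 100), labels, torch.full((2,), 5))
```

The CTC branch enforces monotonic frame-to-label alignment, while the attention branch captures label context (e.g., swallows tending to follow runs of chews); `ctc_weight` trades off the two, and 0.3 here is purely a placeholder.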
Keyphrases
- healthcare
- deep learning
- minimally invasive