Daily Physical Activities Dataset


Dataset Information:

The sensors used for data acquisition consisted of three MTx 3-DOF inertial trackers developed by Xsens Technologies. Each MTx unit includes a tri-axial accelerometer measuring acceleration in 3-D space (with a dynamic range of ±5g, where g denotes the gravitational constant). The sensor placement was chosen to represent human body motion while guaranteeing minimal constraint and better comfort for the wearer, as well as secure attachment. The sensors were placed at the chest, the right thigh and the left ankle, respectively, as shown in Fig. 1. The points near the hip and torso exhibit a 6g range in acceleration.
Our experiments also show that the measured ankle-sensor accelerations during the different activities do not exceed ±5g. The sampling frequency is set to 25 Hz, which is above the 20 Hz required to assess daily physical activity. The sensors were fixed on the subjects with the help of an assistant before the beginning of the measurement session. The sensor placement was chosen to capture both predominantly upper-body activities, such as standing up and sitting down, and predominantly lower-body activities, such as walking, stair ascent and stair descent. Specific straps are used to secure each MTx unit in place; this arrangement allows for efficient inter-subject transfer. The MTx units are connected to a central unit called the Xbus Master, which is attached to the subject's belt. Raw acceleration data are collected over time while the activities are performed, and data transmission between the units and the receiver is carried out over a Bluetooth wireless link.

The experiments were conducted at the LISSI Lab, University of Paris-Est Créteil (UPEC), by six healthy subjects of different ages (none of whom are the researchers) in an office environment. To gather a representative dataset, the recruited volunteers were chosen within a given range of age (25–30 years) and weight (55–70 kg). Activity labels were assigned by an independent operator. Data are stored in a file, and the acceleration signals are analyzed using MATLAB. Twelve activities and transitions were studied; they are described in Table 1, and some of them are illustrated in Fig. 2.
The activities were chosen to provide an appropriate representation of everyday activities involving different parts of the body. The recognized activities and transitions differ in duration and intensity. The subjects were asked to perform the activities in their own style and were not restricted in how the activities should be performed, only in the sequential order of the activities. Note that activities A3, A5, A7, A9 and A11 represent dynamic transitions between static activities. Since each of the three sensor units is a tri-axial accelerometer, a 9-dimensional acceleration time series is recorded over time for each activity. The time series exhibit regime changes over time, where each regime is associated with an activity.
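Given the 25 Hz sampling rate and the 9-channel layout described above (three tri-axial sensors), a common preprocessing step for such regime-changing series is fixed-length windowing before classification. The sketch below assumes nothing about the dataset's actual file format; the 2 s window length and 50% overlap are illustrative choices, not values from the dataset description.

```python
import numpy as np

FS = 25  # sampling frequency in Hz, as stated in the dataset description

def segment_windows(signal: np.ndarray, window_s: float = 2.0,
                    overlap: float = 0.5) -> np.ndarray:
    """Split an (n_samples, 9) acceleration series into fixed-length windows.

    `signal` stacks the three tri-axial sensors (chest, right thigh,
    left ankle) into 9 channels; window length and overlap are
    illustrative choices.
    """
    win = int(window_s * FS)                  # samples per window (50 at 25 Hz)
    step = max(1, int(win * (1.0 - overlap)))  # hop size between windows
    n = (signal.shape[0] - win) // step + 1
    return np.stack([signal[i * step: i * step + win] for i in range(n)])

# Example: 10 s of synthetic 9-channel data -> 2 s windows, 50% overlap
acc = np.zeros((10 * FS, 9))
windows = segment_windows(acc)
print(windows.shape)  # (9, 50, 9): 9 windows of 50 samples x 9 channels
```

Each window can then be fed to a classifier or to a regime-change detector operating per activity segment.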

Figure 1. MTx-Xbus inertial tracker and sensors placement
Table 1. Description of the activities

For any questions and/or comments on this dataset, please contact us by filling in the form below.

Gait Cycle Dataset


Dataset Information:

Walking is a form of mobility that represents the most important human physical activity, owing to its great impact on the majority of activities of daily living. To better understand human walking, it is helpful to quantify gait dynamics accurately. This dataset represents a real-life benchmark for gait cycle recognition applications. It contains gait measurements from five healthy subjects with different profiles (mean age: 27 years; mean weight: 79 kg) and includes the vertical Ground Reaction Force (vGRF) records of the subjects during walking. Each subject, wearing the in-shoe pressure sensors, performed thirty gait cycles along a straight line. The participants were instructed to walk in their own style.

In this experiment, the F-Scan system from Tekscan, an in-shoe pressure measurement system, is used to collect force and pressure data wirelessly, without altering the natural movement of the subject. High-resolution, ultra-thin (0.15 mm) sensors are placed underneath the plantar surface of each foot to characterize the interaction between the foot and the ground. A wireless datalogger attached to the subject's waist transmits pressure and force data from the in-shoe sensors in real time, over a WiFi network, to the host computer (as shown in Figure 1). Each F-Scan sensor measures the subject's plantar pressure and is composed of 960 individual pressure-sensing locations, also called "sensels", arranged in rows and columns on the sensor (as shown in Figure 1(c)). The output of each sensel is digitized over 256 increments (8-bit) with a resolution of 0.2 N. The F-Scan operates at a scan rate of up to 100 Hz.
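The per-sensel quantization given above (256 increments at a 0.2 N resolution) suggests a straightforward conversion from raw 8-bit readings to force. A minimal sketch, which assumes one raw value per sensel and sums per-sensel forces to approximate the total vGRF (a simplification of what the Tekscan software actually computes):

```python
import numpy as np

N_SENSELS = 960      # pressure-sensing locations per F-Scan insole
FORCE_RES_N = 0.2    # newtons per 8-bit increment, per the description

def raw_to_vgrf(raw_frame: np.ndarray) -> float:
    """Convert one frame of raw 8-bit sensel readings (0-255) to the
    total vertical ground reaction force in newtons.

    Summing per-sensel forces as the vGRF is an illustrative
    simplification of a calibrated measurement.
    """
    if raw_frame.shape != (N_SENSELS,):
        raise ValueError("expected one reading per sensel")
    return float(raw_frame.astype(np.float64).sum() * FORCE_RES_N)

# Example: every sensel at half scale (128 increments)
frame = np.full(N_SENSELS, 128, dtype=np.uint8)
print(raw_to_vgrf(frame))  # 960 * 128 * 0.2 = 24576.0 N
```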

Figure 1. In-shoe pressure measurement system

This dataset is fully labeled by analyzing the vertical Ground Reaction Force (vGRF) profiles provided by the in-shoe pressure sensors. The normal gait cycle, involving six gait phases, is considered in this dataset. The considered gait cycle phases are: Loading Response (S1); Mid-Stance (S2); Terminal Stance (S3); Pre-Swing (S4); Mid-Swing (S5); and Terminal Swing (S6). Figure 2 shows an example of manual labelling of the gait-cycle phases.
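For illustration, the six phases can be mapped onto a normalized gait cycle using conventional phase boundaries from the gait literature. The percentages below are approximations for demonstration only, not the dataset's manual vGRF-based labels.

```python
# Phase boundaries as fractions of one gait cycle; these are
# conventional approximations from the gait literature, NOT the
# dataset's manual labels, which were derived from the vGRF profiles.
PHASES = [
    ("S1", 0.00, 0.10),  # Loading Response
    ("S2", 0.10, 0.30),  # Mid-Stance
    ("S3", 0.30, 0.50),  # Terminal Stance
    ("S4", 0.50, 0.60),  # Pre-Swing
    ("S5", 0.60, 0.85),  # Mid-Swing
    ("S6", 0.85, 1.00),  # Terminal Swing
]

def phase_at(cycle_fraction: float) -> str:
    """Return the phase label for a point in the normalized gait cycle."""
    for label, start, end in PHASES:
        if start <= cycle_fraction < end:
            return label
    return "S6"  # cycle_fraction == 1.0 falls in Terminal Swing

print(phase_at(0.05), phase_at(0.55), phase_at(0.95))  # S1 S4 S6
```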

Figure 2. Example of manual data labelling using the vertical Ground Reaction Force (vGRF) measured on each foot.

Attribute Information:

vGRF of the center of force of each foot.
Coordinate location of the center of force of each foot.
Pressure mapping of each foot.
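Given the rows-by-columns pressure mapping, the center of force of each foot can be computed as the pressure-weighted centroid of the sensel grid. This is the standard definition of a center of force; the grid shape used in the example is assumed for illustration and does not reflect the actual F-Scan sensel layout.

```python
import numpy as np

def center_of_force(pressure: np.ndarray) -> tuple[float, float]:
    """Pressure-weighted centroid (row, col) of a 2-D pressure map.

    The exact sensel grid geometry and units are assumptions made
    for illustration.
    """
    total = pressure.sum()
    if total == 0:
        raise ValueError("no load on the sensor")
    rows, cols = np.indices(pressure.shape)
    return (float((rows * pressure).sum() / total),
            float((cols * pressure).sum() / total))

# Example: load concentrated at one sensel -> CoF at that sensel
p = np.zeros((60, 16))
p[40, 8] = 5.0
print(center_of_force(p))  # (40.0, 8.0)
```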

For any questions and/or comments on this dataset, please contact us by filling in the form below.

Cerebro


Hybrid platform for reasoning on events in dynamic and uncertain domains

Cerebro is a hybrid platform for high-level causal and temporal reasoning on events and their effects in dynamic and uncertain domains. Emphasis has been placed on the seamless integration of a causal and epistemic rule-based event calculus system with multi-modal probabilistic inferencing, aiming to close the loop of knowledge representation, reasoning and decision making in real-world settings. Cerebro's reasoning mechanism relies on the Discrete Event Calculus Knowledge Theory (DECKT) and aims to transfer the benefits of causal and epistemic rule-based event calculus, such as the solution to the frame problem for expressive classes of problems, into an efficient forward-chaining system. It thus goes beyond ordinary rule-based systems deployed in dynamic domains, in which the actions that lead to the assertion and retraction of facts have no real semantics or high-level structure.

The Cerebro platform has a modular architecture that enables the integration of a multitude of machine learning and reasoning approaches. It currently includes a design tool for the specification of event calculus rules and a reasoning engine that performs inferences online.

Cerebro is integrated within the Ubistruct living lab to control a variety of devices and robots, in order to enhance the perception capacity of a ubiquitous assistive system in understanding occurring situations and reacting effectively. To this end, Cerebro includes logic programs and Bayesian networks that can recognize the events composing activities of daily living in an indoor environment, monitor their proper execution, and perform assistive actions, in the form of recommendations, alerts or device handling, in order to facilitate users' domestic tasks.

References:

T. Patkos, D. Plexousakis, A. Chibani, and Y. Amirat, "An Event Calculus production rule system in dynamic and uncertain domains," Theory and Practice of Logic Programming, Cambridge University Press, vol. 16, no. 3, pp. 325-352, 2016.

A. Chibani, A. Bikakis, T. Patkos, Y. Amirat, S. Bouznad, N. Ayari, and L. Sabri, "Using Cognitive Ubiquitous Robots for Assisting Dependent People in Smart Spaces," in Intelligent Assistive Robots: Recent Advances in Assistive Robotics for Everyday Activities, S. Mohammed, J. C. Moreno, K. Kong, and Y. Amirat, Eds., Springer Tracts in Advanced Robotics (STAR) series, 2015, pp. 297-316.

T. Patkos, A. Chibani, D. Plexousakis, and Y. Amirat, "A Production Rule-based Framework for Causal and Epistemic Reasoning," in Proc. of the RuleML Symposium, held in conjunction with ECAI 2012, the 20th biennial European Conference on Artificial Intelligence, Montpellier, France, 2012, pp. 120-135.

B. Hu, T. Patkos, A. Chibani, and Y. Amirat, "Rule-Based Context Assessment in Smart Cities," in Proc. of the 6th International Conference on Web Reasoning and Rule Systems, RR 2012, Vienna, Austria, 2012, pp. 221-224.

B. Hu, A. Chibani, and Y. Amirat, "Semantic context relevance assessment in urban ubiquitous environments," in Proc. of the 14th International Conference on Ubiquitous Computing, UbiComp'12, Pittsburgh, United States, 2012, pp. 639-640.

Ubistruct Middleware

 

Ubistruct – UBiquitous Intelligence infraSTRUCTture

Living lab approach

Ubistruct is a service-oriented middleware for the Internet/Web of Things. The architecture of the middleware is structured around a set of abstract functionalities that simplify the registration, discovery, selection and orchestration of real-world objects and services evolving in the cyber-physical ecosystem. Communication with and between objects and services is handled through direct invocation or through publish/subscribe; both are simplified for the programmer through simple keywords (Do, GET, Publish, Subscribe-To). The implementation is based on Java and communication libraries such as HTTP, XMPP, JMS and sockets.
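The four interaction keywords can be illustrated with a minimal in-memory sketch. The class and method names below are hypothetical stand-ins for illustration only; the actual Ubistruct API is Java-based and not reproduced here.

```python
# Minimal in-memory sketch of the four interaction keywords the
# middleware exposes (Do, GET, Publish, Subscribe-To); class and
# method names are hypothetical, not the actual Ubistruct API.
from collections import defaultdict
from typing import Callable

class UbistructClient:
    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> callbacks
        self._state = {}                       # topic -> last value

    def do(self, obj: str, action: str) -> str:
        """Direct invocation of an action on a real-world object."""
        return f"{obj}:{action}:ok"

    def get(self, obj: str, default=None):
        """Direct read of an object's last published value."""
        return self._state.get(obj, default)

    def publish(self, topic: str, value) -> None:
        """Push a value to every subscriber of the topic."""
        self._state[topic] = value
        for callback in self._subscribers[topic]:
            callback(value)

    def subscribe_to(self, topic: str, callback: Callable) -> None:
        """Register a callback for future publications on the topic."""
        self._subscribers[topic].append(callback)

# Example: a presence sensor publishing to a lamp controller
client = UbistructClient()
client.subscribe_to("presence/livingroom",
                    lambda v: client.do("lamp1", "on" if v else "off"))
client.publish("presence/livingroom", True)
print(client.get("presence/livingroom"))  # True
```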

Several services have been implemented using the middleware API to enable context awareness in an ambient intelligence environment. For instance, indoor detection and localization of users are implemented using infrared, RFID or iBeacon technology, and user activities are detected using door and power sensors.

References:

S. Bouznad, A. Chibani, Y. Amirat, L. Sabri, E. Prestes, F. Sebbak, and S. Fiorini, "Context-Aware Monitoring Agents for Ambient Assisted Living Applications," in Proc. of the 13th European Conference on Ambient Intelligence, AmI 2017, Malaga, Spain, 2017, pp. 225-240.

N. Ayari, A. Chibani, Y. Amirat, and E. Matson, "A Semantic Approach for Enhancing Assistive Services in Ubiquitous Robotics," Robotics and Autonomous Systems, Elsevier, vol. 75, pp. 17-27, 2016.

A. Chibani, A. Bikakis, T. Patkos, Y. Amirat, S. Bouznad, N. Ayari, and L. Sabri, "Using Cognitive Ubiquitous Robots for Assisting Dependent People in Smart Spaces," in Intelligent Assistive Robots: Recent Advances in Assistive Robotics for Everyday Activities, S. Mohammed, J. C. Moreno, K. Kong, and Y. Amirat, Eds., Springer Tracts in Advanced Robotics (STAR) series, 2015, pp. 297-316.

A. Chibani and Y. Amirat, "QoS driven context awareness using semantic sensors infrastructure," in Quality of Service Mechanisms in Next Generation Heterogeneous Networks: Utopia or Reality?, A. Mellouk, Ed., ISTE-John Wiley & Sons, 2008, pp. 407-430.

M. A. Gomez, A. Chibani, Y. Amirat, and E. Matson, "IoRT cloud survivability framework for robotic AALs using HARMS," Robotics and Autonomous Systems, Elsevier, vol. 106, pp. 192-206, 2018.

N. Temglit, A. Chibani, K. Djouani, and M. A. Nacer, "A Distributed Agent-Based Approach for Optimal QoS Selection in Web of Object Choreography," IEEE Systems Journal, vol. 12, no. 2, pp. 1655-1666, 2018.

A. Yachir, Y. Amirat, A. Chibani, and N. Badache, "Service-Oriented, User-Centered and Event-Aware Framework for Ambient Intelligence and Internet of Things," IEEE Transactions on Automation Science and Engineering, vol. 13, no. 1, pp. 85-102, 2016.

M. S. Khanouche, Y. Amirat, A. Chibani, M. Kerkar, and A. Yachir, "Energy-centered and QoS-aware services selection for Internet of Things," IEEE Transactions on Automation Science and Engineering, vol. 13, no. 3, pp. 1256-1269, 2016.

A. Yachir, Y. Amirat, A. Chibani, and N. Badache, "Towards an Event-Aware Approach for Ubiquitous Computing based on Automatic Service Composition and Selection," Annals of Telecommunications, Springer, vol. 67, no. 7-8, pp. 341-353, 2012.

K. Tari, Y. Amirat, A. Chibani, A. Yachir, and A. Mellouk, "Context-aware Dynamic Service Composition in Ubiquitous Environment," in Proc. of the IEEE International Conference on Communications (ICC), Cape Town, South Africa, 2010, pp. 1-5.

A. Yachir, Y. Amirat, K. Tari, and A. Chibani, "QoS Based Framework for Ubiquitous Robotic Services Composition," in Proc. of the IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2009, St. Louis, United States, 2009, pp. 2019-2026.

A. Yachir, K. Tari, A. Chibani, and Y. Amirat, "Toward an Automatic Approach for Ubiquitous Robotic Services Composition," in Proc. of the IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2008, Nice, France, 2008, pp. 3717-3724.

Prototype EROWA

 

E-ROWA (Exoskeletal-Robotic Orthotics for Walking Assistance)

E-ROWA, shown in the figure below and developed by SG Mechatronics Co., is designed to help dependent people regain locomotion functions. The exoskeleton consists of four major parts: exoskeletal frames, sensors, a hardware controller, and a power unit. The exoskeletal frames provide ten degrees of freedom (DOFs) and are attached to the wearer's waist and legs by means of straps. For each limb, there are three DOFs at the hip joint (flexion/extension, adduction/abduction and rotation), one DOF at the knee joint, and three DOFs at the ankle joint. The hip and knee joints of each limb are actuated in the sagittal plane using compact Rotary Series Elastic Actuators (cRSEA), while the remaining DOFs are passive. The maximum output torque of each cRSEA is about 25 Nm, and the maximum rotary speed of the knee joint is 3.2 rad/s. The hip, knee, and ankle joints are each equipped with 13-bit absolute encoders. E-ROWA is also equipped with smart shoes in which the ground reaction forces (GRFs) are measured using four force-sensing resistor (FSR) sensors embedded at four locations of the insole. An NI myRIO board is used for data acquisition and for controlling the cRSEA actuators.
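Two small conversions follow from the figures above (13-bit absolute encoders, ~25 Nm peak cRSEA torque). The count-to-angle mapping and the torque clamp below are illustrative sketches, not E-ROWA's actual control software.

```python
import math

ENCODER_BITS = 13                    # absolute encoders at hip, knee, ankle
COUNTS_PER_REV = 2 ** ENCODER_BITS   # 8192 counts per revolution
MAX_TORQUE_NM = 25.0                 # cRSEA peak output, per the description

def counts_to_angle_rad(counts: int) -> float:
    """Convert a 13-bit absolute encoder reading to radians."""
    return (counts % COUNTS_PER_REV) * 2.0 * math.pi / COUNTS_PER_REV

def saturate_torque(command_nm: float) -> float:
    """Clamp a torque command to the actuator's rated output.

    Clamping the command like this is an illustrative safety step,
    not E-ROWA's actual control law.
    """
    return max(-MAX_TORQUE_NM, min(MAX_TORQUE_NM, command_nm))

print(counts_to_angle_rad(4096))  # half scale -> pi = 3.141592653589793
print(saturate_torque(40.0))      # 25.0
```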

Lower limb exoskeleton E-ROWA

References:
[1] W. Huo, S. Mohammed, Y. Amirat, K. Kong, "Fast Gait Mode Detection and Assistive Torque Control of an Exoskeletal Robotic Orthosis for Walking Assistance (E-ROWA)," IEEE Transactions on Robotics, DOI: 10.1109/TRO.2018.2830367, pp. 1-18, 2018.

[2] W. Huo, S. Mohammed, Y. Amirat, K. Kong, "Active Impedance Control of a Lower Limb Exoskeleton to Assist Sit-to-Stand Movement," in Proc. of the IEEE International Conference on Robotics and Automation, ICRA 2016, Stockholm, Sweden, 2016, pp. 3530-3536.

Prototype EICOSI

 

EICOSI (Exoskeleton Intelligently COmmunicating and Sensitive to Intention)

EICOSI is a single degree-of-freedom (DoF) exoskeleton prototype designed to provide power assistance at the wearer's knee joint level (figure below). The exoskeleton consists of two segments, attached separately to the thigh and shank and fixed to the wearer's lower limb using appropriate braces. It is driven by a high-power brushless DC motor (Maxon, Switzerland). To obtain a compact and portable structure as well as a relatively high output torque, a compact transmission system is designed using a gear motor, a ball screw, a belt transmission and a cable drive. The reduction ratio from the motor side to the joint side is 264:1, and the whole actuator can deliver up to approximately 18 Nm. The motor is equipped with an incremental encoder that measures the motor's rotation angle with a resolution of 1000 pulses per revolution. The knee joint angle can be calculated from the motor angle using the reduction ratio of the whole transmission system, and the angular velocity is obtained by simple differentiation of the joint angle. EICOSI is controlled using a host PC equipped with a controller board (dSPACE, Germany) running at 1 kHz. The control programs are developed using MATLAB (MathWorks, USA).
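The joint angle computation described above (motor encoder counts through the 264:1 reduction, velocity by differentiation at the 1 kHz control rate) can be sketched as follows. The x4 quadrature decoding factor is an assumption not stated in the description, which gives only the 1000 pulses/rev resolution.

```python
import math

REDUCTION = 264.0  # motor-to-joint reduction ratio, per the description
PPR = 1000         # incremental encoder pulses per motor revolution
QUAD = 4           # quadrature decoding: 4 counts per pulse (assumed)

def joint_angle_rad(motor_counts: int) -> float:
    """Knee joint angle from accumulated motor encoder counts."""
    motor_rev = motor_counts / (PPR * QUAD)
    return motor_rev * 2.0 * math.pi / REDUCTION

def joint_velocity_rad_s(angle_now: float, angle_prev: float,
                         dt: float = 1e-3) -> float:
    """Backward-difference velocity at the 1 kHz control rate."""
    return (angle_now - angle_prev) / dt

# One full motor revolution (4000 counts) moves the joint by 2*pi/264 rad
print(joint_angle_rad(4000))  # about 0.0238 rad
```

In practice the raw backward difference would be low-pass filtered before use in a controller, since differentiation amplifies encoder quantization noise.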

References:
[1] W. Huo, S. Mohammed, Y. Amirat, "Impedance Reduction Control of a Knee Joint Human Exoskeleton System," IEEE Transactions on Control Systems Technology, DOI: 10.1109/TCST.2018.2865768, 2018.

[2] H. Rifai, S. Mohammed, K. Djouani, Y. Amirat, "Towards Lower Limbs Functional Rehabilitation through a Knee Joint Exoskeleton", IEEE Transactions on Control Systems Technology, vol. 25, no. 2, pp. 712-719, 2017.

[3] H. Rifai, M.-S. B. Abdessalem, A. Chemori, S. Mohammed, Y. Amirat, "Augmented L1 Adaptive Control of an Actuated Knee Joint Exoskeleton: From Design to Real-Time Experiments," in Proc. of the IEEE International Conference on Robotics and Automation, ICRA 2016, Stockholm, Sweden, 2016, pp. 5708-5714.