
Recognizing human motion using KNN

Core Motion is an iOS framework that provides an API for the inertial sensors of mobile devices. It also recognizes some user motion types and stores them in the HealthKit database.

If you are not familiar with the Core Motion API, please check the framework reference: https://developer.apple.com/reference/coremotion.

The code for this example can be found in the Code/02DistanceBased/MotionClassification folder of the supplementary materials.

As of iOS 11 beta 2, the CMMotionActivity class includes the following types of motion:

  • Stationary
  • Walking
  • Running
  • Automotive
  • Cycling

Everything else falls into an unknown category or is recognized as one of the preceding types. Core Motion doesn't provide a way to recognize custom motion types, so we'll train our own classifier for this purpose. Unlike the decision trees from the previous chapter, the KNN classifier will be trained on the device, end to end. It also won't be frozen inside a Core ML model: because we keep full control over it, we'll be able to update it at application runtime.
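To make the on-device, updatable-at-runtime idea concrete, here is a minimal sketch of a brute-force KNN classifier in Swift. The type name KNNClassifier, its methods, and the default k = 3 are illustrative assumptions for this sketch, not the project's actual implementation:

```swift
// A minimal, illustrative KNN classifier: it stores labeled feature vectors
// and predicts by majority vote among the k nearest stored samples.
// Names and design are assumptions for this sketch, not the final project code.
struct KNNClassifier {
    private var samples: [(features: [Double], label: String)] = []
    let k: Int

    init(k: Int = 3) {
        self.k = k
    }

    // "Training" is just memorization, which is why it can happen on the
    // device at runtime: no optimization step, no frozen model file.
    mutating func add(features: [Double], label: String) {
        samples.append((features, label))
    }

    // Euclidean distance between two feature vectors of equal length.
    private func distance(_ a: [Double], _ b: [Double]) -> Double {
        var sum = 0.0
        for (x, y) in zip(a, b) {
            sum += (x - y) * (x - y)
        }
        return sum.squareRoot()
    }

    // Predict by voting among the k nearest stored samples.
    func predict(features: [Double]) -> String? {
        guard !samples.isEmpty else { return nil }
        let nearest = samples
            .map { (label: $0.label, dist: distance($0.features, features)) }
            .sorted { $0.dist < $1.dist }
            .prefix(k)
        var votes: [String: Int] = [:]
        for neighbor in nearest {
            votes[neighbor.label, default: 0] += 1
        }
        return votes.max { $0.value < $1.value }?.key
    }
}
```

Because adding a sample is just an array append, the classifier can keep learning from new recordings while the app is running, which is exactly the property we need here.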

iOS devices have three types of motion sensors:

  • Gyroscope: This measures the device's rotation rate, from which its orientation in space can be derived
  • Accelerometer: This measures device acceleration
  • Magnetometer or compass: This measures the surrounding magnetic field

They also have a barometer, which can be used to detect elevation changes, and some other sensors, but these are less relevant for our purposes. We will use the accelerometer data stream to train our KNN classifier and predict custom motion types, such as shaking the phone or squatting.

The following listing shows how to get updates from the accelerometer:

import CoreMotion

// Keep a strong reference to the manager (for example, as a property);
// otherwise, updates stop as soon as it is deallocated.
let manager = CMMotionManager()
manager.accelerometerUpdateInterval = 0.1  // seconds between samples
manager.startAccelerometerUpdates(to: OperationQueue.main) { (data: CMAccelerometerData?, error: Error?) in
    if let acceleration = data?.acceleration {
        print(acceleration.x, acceleration.y, acceleration.z)
    }
}

The accelerometer APIs in Core Motion provide a time series of three-dimensional vectors, as shown in the following diagram:

Figure 3.7: Core Motion coordinate system for accelerometer and gyroscope
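Each of these vectors will be reduced to a single number, its Euclidean magnitude, before being fed to the classifier, as described next. A small helper for this could look like the following; the magnitude extension is our own convenience for this sketch, not part of Core Motion:

```swift
import CoreMotion

extension CMAcceleration {
    // Euclidean magnitude of the acceleration vector: sqrt(x² + y² + z²).
    var magnitude: Double {
        return (x * x + y * y + z * z).squareRoot()
    }
}
```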

To train our classifier, we need some labeled data. As we don't have a ready-made dataset, and motion signals can differ significantly from person to person, we are going to let the user add new samples and improve the model. In the interface, the user selects the type of motion they want to record and presses the Record button, as shown in the next screenshot. The application then samples 25 acceleration vectors, takes the magnitude of each vector, and feeds the resulting sequence, together with the label of the selected motion type, into the KNN classifier. The user can record as many samples as they want.
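A sketch of this recording step is shown below. The buffer size of 25 matches the description above, while the KNNClassifier type and the magnitude helper come from the earlier illustrative sketches, so treat the details as assumptions rather than the project's actual code:

```swift
import CoreMotion

let sampleLength = 25                 // number of acceleration vectors per sample
var classifier = KNNClassifier(k: 3)  // the illustrative classifier from above
var buffer: [Double] = []             // magnitudes collected for the current sample

let recorder = CMMotionManager()
recorder.accelerometerUpdateInterval = 0.1

// Called when the user selects a motion type and taps Record.
func startRecording(label: String) {
    buffer.removeAll()
    recorder.startAccelerometerUpdates(to: .main) { data, _ in
        guard let acceleration = data?.acceleration else { return }
        // Reduce each 3D vector to its magnitude, as described above.
        buffer.append(acceleration.magnitude)
        if buffer.count == sampleLength {
            recorder.stopAccelerometerUpdates()
            // Feed the labeled sample into the classifier.
            classifier.add(features: buffer, label: label)
        }
    }
}
```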
