ArKit #8

Open · wants to merge 6 commits into base: master
34 changes: 27 additions & 7 deletions README.md
@@ -1,25 +1,45 @@
# Bluetoothed ARKit 2.0 with ARWorldMap!
# ARKit 6.0

After Apple’s introduction of ARKit 2, we have been consistently working behind to create shared-AR experiences. Our goal is to improve the utility of mobile using AR experiences.
After Apple’s introduction of ARKit, we have been working consistently to create various AR experiences leveraging the power of ARKit, RealityKit, and SceneKit. Our goal is to improve the utility of mobile devices through AR experiences.

This demo created using ARKit 2:
This demo, created using ARKit, does the following:
* Creates Geo-localized AR experiences on ARWorldMap
* Detects objects and images
* Mark specific objects and create 3D renders in point cloud
* Share information locally over BLE (Bluetooth low energy)
* Shares information locally while maintaining the ARWorld session
* Detects the user's sitting posture and provides feedback on it
* Detects the user's standing posture along with joint angles

## Features in this demo:
* Image tracking
* Face tracking
* Sitting posture tracking
* Standing posture tracking
* Save and load maps
* Detect objects
* Environmental texturing
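The "Save and load maps" feature above rests on ARKit's `ARWorldMap`. A minimal sketch of capturing, archiving, and restoring a map; the helper names are ours, not the demo's:

```swift
import ARKit

// Capture the session's current world map and archive it to disk.
func saveWorldMap(from session: ARSession, to url: URL) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else {
            return print("No map yet: \(String(describing: error))")
        }
        if let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                        requiringSecureCoding: true) {
            try? data.write(to: url)
        }
    }
}

// Restore a previously saved world map from disk.
func loadWorldMap(from url: URL) -> ARWorldMap? {
    guard let data = try? Data(contentsOf: url) else { return nil }
    return try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self, from: data)
}

// Relocalize by assigning the restored map to a new configuration:
// let config = ARWorldTrackingConfiguration()
// config.initialWorldMap = loadWorldMap(from: mapURL)
// sceneView.session.run(config, options: [.resetTracking, .removeExistingAnchors])
```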

### Prerequisites
Before we dive into the implementation details, let’s take a look at the prerequisites for this demo.

* Xcode 10 (beta or above)
* iOS 12 (beta or above)
* Physical iPhone 6S or above
* Latest Xcode
* Latest iOS version
* Physical iPhone device (a device newer than the iPhone X is recommended for performance)

### Face Tracking and loading live 3D content
Tracking and visualizing faces is a key ARKit feature: it tracks the user's face along with their expressions and simultaneously mimics those expressions on a 3D model. Many other use cases are possible by honing ARKit's face-tracking capability.

In this tutorial we have added some basic face-tracking functionality, along with mimicking the user's facial expressions.
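The face-tracking flow described above can be sketched as follows. This is a minimal illustration, not the demo's exact code; the class name and the morph-target name `"jawOpen"` are our assumptions:

```swift
import ARKit
import SceneKit

// Minimal sketch: run face tracking in an ARSCNView and read blend-shape
// coefficients to mirror the user's expression on a 3D model.
class FaceTrackingVC: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!   // assumed to be wired up in a storyboard

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Face tracking requires a TrueDepth front camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        sceneView.delegate = self
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }
        // Blend shapes are 0…1 coefficients; use them to drive the model's morph targets.
        let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0
        node.childNodes.first?.morpher?.setWeight(CGFloat(jawOpen), forTargetNamed: "jawOpen")
    }
}
```

The same pattern extends to any of the ~50 `ARFaceAnchor.BlendShapeLocation` coefficients (eye blinks, brow raises, and so on).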

### Body Tracking with angle detection
Body tracking is an essential ARKit feature that lets you track a person in the physical environment and visualize their motion by applying the same body movements to a virtual character.
Alongside this, we can create our own model to mimic the user's movements, or use the "biped_robot" model provided by Apple.

In this demo we detect two types of posture:
1. Sitting posture
Here we detect the angle between the knee and spine joints. According to certain reports, sitting puts pressure on the spinal joints, and sitting with reliable back support at an angle of roughly 90 degrees or more applies less of that pressure. The demo therefore draws a green shape when the detected posture meets this threshold, and turns it red otherwise.
2. Standing posture
This mode detects the user's movement along with joint angles. We build a skeleton from cylinders (bones) and spheres (joints) that mimics the user's movements, and display an angle at each joint calculated from its two neighboring joints. This use case serves various body-tracking purposes and can be useful for exercise-related applications.
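The per-joint angle described above reduces to the angle between two segment vectors meeting at a joint. A minimal sketch, assuming our own helper names (`jointAngle`, `position(of:in:)`), not the demo's exact code:

```swift
import ARKit
import simd

/// Angle (in degrees) at joint `b`, formed by the segments toward `a` and `c`.
func jointAngle(_ a: SIMD3<Float>, _ b: SIMD3<Float>, _ c: SIMD3<Float>) -> Float {
    let u = simd_normalize(a - b)
    let v = simd_normalize(c - b)
    // Clamp the cosine to [-1, 1] to guard against floating-point drift.
    let cosine = max(-1, min(1, simd_dot(u, v)))
    return acos(cosine) * 180 / .pi
}

/// With ARBodyTrackingConfiguration running, joint positions come from the
/// body anchor's skeleton; the translation sits in column 3 of each transform.
func position(of joint: ARSkeleton.JointName, in bodyAnchor: ARBodyAnchor) -> SIMD3<Float>? {
    guard let transform = bodyAnchor.skeleton.modelTransform(for: joint) else { return nil }
    return SIMD3<Float>(transform.columns.3.x, transform.columns.3.y, transform.columns.3.z)
}
// e.g. an elbow angle uses the shoulder, elbow, and hand joints of one arm;
// jointAngle((1,0,0), (0,0,0), (0,1,0)) yields 90 degrees.
```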

### Image recognition and tracking
“A photo is worth a thousand words.” Words are fine, but ARKit turns a photo into thousands of stories.
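Image recognition of this kind is typically driven by `ARImageTrackingConfiguration`. A minimal sketch, assuming the reference images live in an asset-catalog group named "AR Resources" (our assumption, not necessarily the demo's setup):

```swift
import ARKit

// Build a configuration that recognizes and tracks known reference images.
func makeImageTrackingConfiguration() -> ARImageTrackingConfiguration? {
    guard let referenceImages = ARReferenceImage.referenceImages(
        inGroupNamed: "AR Resources", bundle: .main) else { return nil }
    let configuration = ARImageTrackingConfiguration()
    configuration.trackingImages = referenceImages
    configuration.maximumNumberOfTrackedImages = 1
    return configuration
}
// While the session runs, an ARImageAnchor is delivered for each recognized
// image; overlay content on that anchor's node to tell the photo's "story".
```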
268 changes: 259 additions & 9 deletions iOS12_Sampler/ios12 Sampler.xcodeproj/project.pbxproj

Large diffs are not rendered by default.

48 changes: 48 additions & 0 deletions iOS12_Sampler/ios12 Sampler/AVDetailsVC.swift
@@ -0,0 +1,48 @@
//
// AVDetailsVC.swift
// ios12 Sampler
//
// Created by Dhruvil Vora on 29/02/24.
// Copyright © 2024 Testing. All rights reserved.
//

import UIKit

class AVDetailsVC: BaseCameraVC {

@IBOutlet weak var cameraView: UIView!

override func viewDidLoad() {
super.viewDidLoad()
// Do any additional setup after loading the view.
cameraView.layer.addSublayer(prevLayer!)
}

override func viewWillAppear(_ animated: Bool) {
super.viewWillAppear(animated)
}

override func viewWillDisappear(_ animated: Bool) {
super.viewWillDisappear(animated)
}

@IBAction func btnLiveImageFilterClicked(_ sender: UIButton) {
let vc = self.storyboard?.instantiateViewController(withIdentifier: "ARImageDetectorVC") as? ARImageDetectorVC
self.navigationController?.pushViewController(vc!, animated: true)
}

@IBAction func btnSurfaceDetectionClicked(_ sender: UIButton) {
let vc = self.storyboard?.instantiateViewController(withIdentifier: "ARFaceDetection") as? ARFaceDetection
self.navigationController?.pushViewController(vc!, animated: true)
}

@IBAction func btnSittingPostureClicked(_ sender: UIButton) {
let vc = self.storyboard?.instantiateViewController(withIdentifier: "ARPostureDetection") as? ARPostureDetection
self.navigationController?.pushViewController(vc!, animated: true)
}

@IBAction func btnStandingPostureClicked(_ sender: UIButton) {
let vc = self.storyboard?.instantiateViewController(withIdentifier: "StandingPostureVC") as? StandingPostureVC
self.navigationController?.pushViewController(vc!, animated: true)
}
}
56 changes: 9 additions & 47 deletions iOS12_Sampler/ios12 Sampler/AVMainVC.swift
@@ -9,37 +9,25 @@
import UIKit
import AVFoundation

class AVMainVC: UIViewController, AVCaptureMetadataOutputObjectsDelegate {
class AVMainVC: BaseCameraVC {

@IBOutlet weak var cameraView: UIView!

var session: AVCaptureSession?
var device: AVCaptureDevice?
var input: AVCaptureDeviceInput?
var output: AVCaptureMetadataOutput?
var prevLayer: AVCaptureVideoPreviewLayer?

@IBOutlet weak var CameraView: UIView!

override func viewDidLoad() {
super.viewDidLoad()
createSession()
// Do any additional setup after loading the view.
cameraView.layer.addSublayer(prevLayer!)
}
override func viewWillAppear(_ animated: Bool) {
super.viewWillAppear(animated)
self.navigationController?.isNavigationBarHidden = true
}
override func viewWillDisappear(_ animated: Bool) {
super.viewWillDisappear(animated)
self.navigationController?.isNavigationBarHidden = false
}

@IBAction func btnActionWorldSharing(_ sender: Any) {
let vc = self.storyboard?.instantiateViewController(withIdentifier: "ARImageLocator") as? ARImageLocator
let vc = self.storyboard?.instantiateViewController(withIdentifier: "ARImageDetectorVC") as? ARImageDetectorVC
self.navigationController?.pushViewController(vc!, animated: true)
// let vc = self.storyboard?.instantiateViewController(withIdentifier: "ARSurfaceDetectionVC") as? ARSurfaceDetectionVC
// self.navigationController?.pushViewController(vc!, animated: true)
// let vc = self.storyboard?.instantiateViewController(withIdentifier: "AVSharingWorldMapVC") as? AVSharingWorldMapVC
// self.navigationController?.pushViewController(vc!, animated: true)
}

@IBAction func btnActionScanAndDetectObjects(_ sender: UIButton) {
@@ -54,35 +42,9 @@ class AVMainVC: UIViewController, AVCaptureMetadataOutputObjectsDelegate {
let vc = self.storyboard?.instantiateViewController(withIdentifier: "AVTextureEnvironment") as? AVTextureEnvironment
self.navigationController?.pushViewController(vc!, animated: true)
}
func createSession() {
session = AVCaptureSession()
device = AVCaptureDevice.default(for: .video)

var error: NSError? = nil
do {
if device != nil {
input = try AVCaptureDeviceInput(device: device!)
}
} catch {
print(error)
}

if error == nil {
if input != nil {
session?.addInput(input!)
}
} else {
print("camera input error: \(String(describing: error))")
}

prevLayer = AVCaptureVideoPreviewLayer(session: session!)
let del = UIApplication.shared.delegate as? AppDelegate
prevLayer?.frame = (del?.window?.frame)!
prevLayer?.videoGravity = AVLayerVideoGravity.resizeAspectFill
CameraView.layer.addSublayer(prevLayer!)
DispatchQueue.global().async {
self.session?.startRunning()
}
}

@IBAction func btnMoreClicked(_ sender: UIButton) {
let vc = self.storyboard?.instantiateViewController(withIdentifier: "AVDetailsVC") as? AVDetailsVC
self.navigationController?.pushViewController(vc!, animated: true)
}
}
@@ -0,0 +1,6 @@
{
"info" : {
"author" : "xcode",
"version" : 1
}
}
@@ -0,0 +1,21 @@
{
"images" : [
{
"idiom" : "universal",
"scale" : "1x"
},
{
"filename" : "Cyclops.png",
"idiom" : "universal",
"scale" : "2x"
},
{
"idiom" : "universal",
"scale" : "3x"
}
],
"info" : {
"author" : "xcode",
"version" : 1
}
}
@@ -0,0 +1,21 @@
{
"images" : [
{
"idiom" : "universal",
"scale" : "1x"
},
{
"filename" : "Glasses.png",
"idiom" : "universal",
"scale" : "2x"
},
{
"idiom" : "universal",
"scale" : "3x"
}
],
"info" : {
"author" : "xcode",
"version" : 1
}
}
@@ -0,0 +1,21 @@
{
"images" : [
{
"idiom" : "universal",
"scale" : "1x"
},
{
"filename" : "heart2.png",
"idiom" : "universal",
"scale" : "2x"
},
{
"idiom" : "universal",
"scale" : "3x"
}
],
"info" : {
"author" : "xcode",
"version" : 1
}
}
@@ -0,0 +1,21 @@
{
"images" : [
{
"idiom" : "universal",
"scale" : "1x"
},
{
"filename" : "Neon.png",
"idiom" : "universal",
"scale" : "2x"
},
{
"idiom" : "universal",
"scale" : "3x"
}
],
"info" : {
"author" : "xcode",
"version" : 1
}
}
@@ -0,0 +1,21 @@
{
"images" : [
{
"idiom" : "universal",
"scale" : "1x"
},
{
"filename" : "Robo.png",
"idiom" : "universal",
"scale" : "2x"
},
{
"idiom" : "universal",
"scale" : "3x"
}
],
"info" : {
"author" : "xcode",
"version" : 1
}
}
@@ -0,0 +1,21 @@
{
"images" : [
{
"idiom" : "universal",
"scale" : "1x"
},
{
"filename" : "Star.png",
"idiom" : "universal",
"scale" : "2x"
},
{
"idiom" : "universal",
"scale" : "3x"
}
],
"info" : {
"author" : "xcode",
"version" : 1
}
}
@@ -0,0 +1,21 @@
{
"images" : [
{
"idiom" : "universal",
"scale" : "1x"
},
{
"filename" : "Swag.png",
"idiom" : "universal",
"scale" : "2x"
},
{
"idiom" : "universal",
"scale" : "3x"
}
],
"info" : {
"author" : "xcode",
"version" : 1
}
}