diff --git a/README.md b/README.md
index 7be4cfe..b4f149e 100644
--- a/README.md
+++ b/README.md
@@ -1,15 +1,20 @@
-# Bluetoothed ARKit 2.0 with ARWorldMap!
+# ARKit 6.0
-After Apple’s introduction of ARKit 2, we have been consistently working behind to create shared-AR experiences. Our goal is to improve the utility of mobile using AR experiences.
+After Apple’s introduction of ARKit, we have been consistently working to create various AR experiences leveraging the power of ARKit, RealityKit, and SceneKit. Our goal is to improve the utility of mobile devices through AR experiences.
-This demo created using ARKit 2:
+This demo, created using ARKit:
 * Creates Geo-localized AR experiences on ARWorldMap
 * Detects objects and images
 * Mark specific objects and create 3D renders in point cloud
-* Share information locally over BLE (Bluetooth low energy)
+* Shares information locally while maintaining the ARWorld session
+* Detects the user's sitting posture and provides feedback on it
+* Detects the user's standing posture along with joint angles
 ## Features in this demo:
 * Image tracking
+* Face tracking
+* Sitting posture tracking
+* Standing posture tracking
 * Save and load maps
 * Detect objects
 * Environmental texturing
@@ -17,9 +22,24 @@ This demo created using ARKit 2:
 ### Prerequisites
 Before we dive into the implementation details let’s take a look at the prerequisites of the course.
-* Xcode 10 (beta or above)
-* iOS 12 (beta or above)
-* Physical iPhone 6S or above
+* Latest Xcode
+* Latest iOS version
+* Physical iPhone (an iPhone X or later is recommended for performance)
+
+### Face tracking and loading live 3D content
+Face tracking follows the user's face and expressions and simultaneously mimics those expressions with a 3D model; there are many other possible use cases built on ARKit's face-tracking capability.
+
+In this tutorial we add basic face tracking along with mimicking the user's facial expressions.
+
+### Body tracking with angle detection
+Body tracking is an essential ARKit feature that tracks a person in the physical environment and visualizes their motion by applying the same body movements to a virtual character.
+We can either create our own model to mimic the user's movements or use the "biped_robot" model provided by Apple.
+
+In this demo we detect two types of posture:
+1. Sitting posture
+The demo measures the angle between the knee and spine joints. According to certain reports, sitting puts pressure on the spinal joints; when the user sits with reliable support at an angle of roughly 90 degrees or more, which puts less pressure on the spine, the on-screen shape turns green, and it turns red otherwise.
+2. Standing posture
+This demo detects the user's standing movement along with joint angles. A skeleton built from cylinders (bones) and spheres (joints) mimics the user's movement, and an angle is computed at each joint from its two neighboring joints (see the sketch after this diff). This use case can serve various body-tracking purposes and is useful for exercise-related applications.
 ### Image recognition and tracking
 “A photo is like a thousands words” - words are fine, but, ARKit-2 turns a photo into thousands of stories.
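The joint angles referenced above are the angle formed at a joint by the two bones that meet there. Below is a minimal sketch of that calculation, assuming an `ARBodyAnchor` delivered by ARKit body tracking; the helper name `angleAtJoint` and the example joint names are illustrative, not the project's actual API.

```swift
import ARKit
import simd

/// Returns the angle (in degrees) formed at `middle` by the segments
/// middle->first and middle->last, e.g. the knee angle for hip–knee–foot.
/// Joint names are illustrative; the tracked skeleton exposes names such as
/// "left_upLeg_joint", "left_leg_joint", and "left_foot_joint".
func angleAtJoint(first: ARSkeleton.JointName,
                  middle: ARSkeleton.JointName,
                  last: ARSkeleton.JointName,
                  in bodyAnchor: ARBodyAnchor) -> Float? {
    let skeleton = bodyAnchor.skeleton
    guard let a = skeleton.modelTransform(for: first),
          let b = skeleton.modelTransform(for: middle),
          let c = skeleton.modelTransform(for: last) else { return nil }

    // Joint positions relative to the body anchor (translation column of each transform).
    let pA = simd_make_float3(a.columns.3)
    let pB = simd_make_float3(b.columns.3)
    let pC = simd_make_float3(c.columns.3)

    // Angle between the two bone directions meeting at the middle joint.
    let v1 = simd_normalize(pA - pB)
    let v2 = simd_normalize(pC - pB)
    let cosine = simd_clamp(simd_dot(v1, v2), -1, 1)
    return acos(cosine) * 180 / .pi
}
```

Called from `session(_:didUpdate:)` with, for example, the hip, knee, and foot joints, this yields the kind of knee angle the sitting-posture demo compares against its roughly 90-degree threshold.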
diff --git a/iOS12_Sampler/ios12 Sampler.xcodeproj/project.pbxproj b/iOS12_Sampler/ios12 Sampler.xcodeproj/project.pbxproj index 63d181d..2d5c42b 100644 --- a/iOS12_Sampler/ios12 Sampler.xcodeproj/project.pbxproj +++ b/iOS12_Sampler/ios12 Sampler.xcodeproj/project.pbxproj @@ -20,6 +20,15 @@ 2E18E1572BB2AC9300D4C1E3 /* BodySkeleton+Entity.swift in Sources */ = {isa = PBXBuildFile; fileRef = 2E18E1542BB2AC9300D4C1E3 /* BodySkeleton+Entity.swift */; }; 2E18E1592BB2AD2800D4C1E3 /* String+Extension.swift in Sources */ = {isa = PBXBuildFile; fileRef = 2E18E1582BB2AD2800D4C1E3 /* String+Extension.swift */; }; 2E18E15B2BB2BB1500D4C1E3 /* StandingPostureVC.swift in Sources */ = {isa = PBXBuildFile; fileRef = 2E18E15A2BB2BB1500D4C1E3 /* StandingPostureVC.swift */; }; + 2E2C35C42BE215F100200E7E /* ARFaceDetection.swift in Sources */ = {isa = PBXBuildFile; fileRef = 2E2C35C32BE215F100200E7E /* ARFaceDetection.swift */; }; + 2E2C35C62BE216FC00200E7E /* ModelCollectionCell.swift in Sources */ = {isa = PBXBuildFile; fileRef = 2E2C35C52BE216FC00200E7E /* ModelCollectionCell.swift */; }; + 2E41DFD12BE2206F0012F773 /* Heart.usdz in Resources */ = {isa = PBXBuildFile; fileRef = 2E41DFCA2BE2206F0012F773 /* Heart.usdz */; }; + 2E41DFD22BE2206F0012F773 /* Glasses.usdz in Resources */ = {isa = PBXBuildFile; fileRef = 2E41DFCB2BE2206F0012F773 /* Glasses.usdz */; }; + 2E41DFD32BE2206F0012F773 /* Cyclops.usdz in Resources */ = {isa = PBXBuildFile; fileRef = 2E41DFCC2BE2206F0012F773 /* Cyclops.usdz */; }; + 2E41DFD42BE2206F0012F773 /* Neon.usdz in Resources */ = {isa = PBXBuildFile; fileRef = 2E41DFCD2BE2206F0012F773 /* Neon.usdz */; }; + 2E41DFD52BE2206F0012F773 /* Star.usdz in Resources */ = {isa = PBXBuildFile; fileRef = 2E41DFCE2BE2206F0012F773 /* Star.usdz */; }; + 2E41DFD62BE2206F0012F773 /* Swag.usdz in Resources */ = {isa = PBXBuildFile; fileRef = 2E41DFCF2BE2206F0012F773 /* Swag.usdz */; }; + 2E41DFD72BE2206F0012F773 /* Animoji.usdz in Resources */ = {isa = PBXBuildFile; fileRef = 2E41DFD02BE2206F0012F773 /* Animoji.usdz */; }; 2E70214B2B8EFC4000089680 /* ARPostureDetection.swift in Sources */ = {isa = PBXBuildFile; fileRef = 2E70214A2B8EFC4000089680 /* ARPostureDetection.swift */; }; 2E70214F2B90A0D900089680 /* BaseCameraVC.swift in Sources */ = {isa = PBXBuildFile; fileRef = 2E70214E2B90A0D900089680 /* BaseCameraVC.swift */; }; 2E7021512B90A42700089680 /* AVDetailsVC.swift in Sources */ = {isa = PBXBuildFile; fileRef = 2E7021502B90A42700089680 /* AVDetailsVC.swift */; }; @@ -31,6 +40,7 @@ 2E7503282B30640100DF78E1 /* ARSurfaceDetectionVC.swift in Sources */ = {isa = PBXBuildFile; fileRef = 2E7503272B30640100DF78E1 /* ARSurfaceDetectionVC.swift */; }; 2E75032B2B3079FB00DF78E1 /* Plane.swift in Sources */ = {isa = PBXBuildFile; fileRef = 2E75032A2B3079FB00DF78E1 /* Plane.swift */; }; 2E75032D2B30913E00DF78E1 /* Utility.swift in Sources */ = {isa = PBXBuildFile; fileRef = 2E75032C2B30913E00DF78E1 /* Utility.swift */; }; + 2E7D4AFC2BBADA0600DE0004 /* Metal_Round_Glasses.usdz in Resources */ = {isa = PBXBuildFile; fileRef = 2E7D4AF12BBADA0600DE0004 /* Metal_Round_Glasses.usdz */; }; 2E89AB4A2B46DBD8005EB695 /* ImageTrackingUtility.swift in Sources */ = {isa = PBXBuildFile; fileRef = 2E89AB492B46DBD8005EB695 /* ImageTrackingUtility.swift */; }; 2E89AB4C2B47E11C005EB695 /* VisuallizationNode.swift in Sources */ = {isa = PBXBuildFile; fileRef = 2E89AB4B2B47E11C005EB695 /* VisuallizationNode.swift */; }; 2E89AB4F2B481EAA005EB695 /* FilterCell.swift in Sources */ = {isa = PBXBuildFile; fileRef = 
2E89AB4E2B481EAA005EB695 /* FilterCell.swift */; }; @@ -117,6 +127,15 @@ 2E18E1542BB2AC9300D4C1E3 /* BodySkeleton+Entity.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; name = "BodySkeleton+Entity.swift"; path = "ios12 Sampler/Body Detection with AR/Standing Posture/SkeletonHelper/BodySkeleton+Entity.swift"; sourceTree = SOURCE_ROOT; }; 2E18E1582BB2AD2800D4C1E3 /* String+Extension.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = "String+Extension.swift"; sourceTree = ""; }; 2E18E15A2BB2BB1500D4C1E3 /* StandingPostureVC.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = StandingPostureVC.swift; sourceTree = ""; }; + 2E2C35C32BE215F100200E7E /* ARFaceDetection.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = ARFaceDetection.swift; sourceTree = ""; }; + 2E2C35C52BE216FC00200E7E /* ModelCollectionCell.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = ModelCollectionCell.swift; sourceTree = ""; }; + 2E41DFCA2BE2206F0012F773 /* Heart.usdz */ = {isa = PBXFileReference; lastKnownFileType = file.usdz; path = Heart.usdz; sourceTree = ""; }; + 2E41DFCB2BE2206F0012F773 /* Glasses.usdz */ = {isa = PBXFileReference; lastKnownFileType = file.usdz; path = Glasses.usdz; sourceTree = ""; }; + 2E41DFCC2BE2206F0012F773 /* Cyclops.usdz */ = {isa = PBXFileReference; lastKnownFileType = file.usdz; path = Cyclops.usdz; sourceTree = ""; }; + 2E41DFCD2BE2206F0012F773 /* Neon.usdz */ = {isa = PBXFileReference; lastKnownFileType = file.usdz; path = Neon.usdz; sourceTree = ""; }; + 2E41DFCE2BE2206F0012F773 /* Star.usdz */ = {isa = PBXFileReference; lastKnownFileType = file.usdz; path = Star.usdz; sourceTree = ""; }; + 2E41DFCF2BE2206F0012F773 /* Swag.usdz */ = {isa = PBXFileReference; lastKnownFileType = file.usdz; path = Swag.usdz; sourceTree = ""; }; + 2E41DFD02BE2206F0012F773 /* Animoji.usdz */ = {isa = PBXFileReference; lastKnownFileType = file.usdz; path = Animoji.usdz; sourceTree = ""; }; 2E70214A2B8EFC4000089680 /* ARPostureDetection.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = ARPostureDetection.swift; sourceTree = ""; }; 2E70214E2B90A0D900089680 /* BaseCameraVC.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = BaseCameraVC.swift; sourceTree = ""; }; 2E7021502B90A42700089680 /* AVDetailsVC.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = AVDetailsVC.swift; sourceTree = ""; }; @@ -128,6 +147,7 @@ 2E7503272B30640100DF78E1 /* ARSurfaceDetectionVC.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = ARSurfaceDetectionVC.swift; sourceTree = ""; }; 2E75032A2B3079FB00DF78E1 /* Plane.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = Plane.swift; sourceTree = ""; }; 2E75032C2B30913E00DF78E1 /* Utility.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = Utility.swift; sourceTree = ""; }; + 2E7D4AF12BBADA0600DE0004 /* Metal_Round_Glasses.usdz */ = {isa = PBXFileReference; lastKnownFileType = file.usdz; path = Metal_Round_Glasses.usdz; sourceTree = ""; }; 2E89AB492B46DBD8005EB695 /* ImageTrackingUtility.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = ImageTrackingUtility.swift; sourceTree = ""; }; 2E89AB4B2B47E11C005EB695 /* VisuallizationNode.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = 
VisuallizationNode.swift; sourceTree = ""; }; 2E89AB4E2B481EAA005EB695 /* FilterCell.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = FilterCell.swift; sourceTree = ""; }; @@ -281,6 +301,86 @@ path = SkeletonHelper; sourceTree = ""; }; + 2E2C35C22BE215CB00200E7E /* Face Detection with AR */ = { + isa = PBXGroup; + children = ( + 2E2C35C32BE215F100200E7E /* ARFaceDetection.swift */, + 2E2C35C72BE2170100200E7E /* Cell */, + ); + path = "Face Detection with AR"; + sourceTree = ""; + }; + 2E2C35C72BE2170100200E7E /* Cell */ = { + isa = PBXGroup; + children = ( + 2E2C35C52BE216FC00200E7E /* ModelCollectionCell.swift */, + ); + path = Cell; + sourceTree = ""; + }; + 2E41DFC92BE220250012F773 /* Face Models */ = { + isa = PBXGroup; + children = ( + 2E7D4AF12BBADA0600DE0004 /* Metal_Round_Glasses.usdz */, + 2E41DFD02BE2206F0012F773 /* Animoji.usdz */, + 2E41DFCC2BE2206F0012F773 /* Cyclops.usdz */, + 2E41DFCB2BE2206F0012F773 /* Glasses.usdz */, + 2E41DFCA2BE2206F0012F773 /* Heart.usdz */, + 2E41DFCD2BE2206F0012F773 /* Neon.usdz */, + 2E41DFCE2BE2206F0012F773 /* Star.usdz */, + 2E41DFCF2BE2206F0012F773 /* Swag.usdz */, + ); + path = "Face Models"; + sourceTree = ""; + }; + 2E7021492B8EFBD500089680 /* Body Detection with AR */ = { + isa = PBXGroup; + children = ( + 2E70214A2B8EFC4000089680 /* ARPostureDetection.swift */, + 2E18E1452BB2A70900D4C1E3 /* ARPostureDetection+ARSessionDelegate.swift */, + ); + path = "Sitting Posture"; + sourceTree = ""; + }; + 2E18E1472BB2A7AB00D4C1E3 /* Standing Posture */ = { + isa = PBXGroup; + children = ( + 2E18E15A2BB2BB1500D4C1E3 /* StandingPostureVC.swift */, + 2E18E15C2BB2BB3500D4C1E3 /* SkeletonHelper */, + 2E18E1512BB2ABE300D4C1E3 /* Skeleton */, + ); + path = "Standing Posture"; + sourceTree = ""; + }; + 2E18E1482BB2A93B00D4C1E3 /* Enums */ = { + isa = PBXGroup; + children = ( + 2E18E1492BB2A94F00D4C1E3 /* Bones.swift */, + 2E18E14A2BB2A94F00D4C1E3 /* JointAngles.swift */, + ); + path = Enums; + sourceTree = ""; + }; + 2E18E1512BB2ABE300D4C1E3 /* Skeleton */ = { + isa = PBXGroup; + children = ( + 2E18E14E2BB2AAEE00D4C1E3 /* SkeletonBone.swift */, + 2E18E14D2BB2AAEE00D4C1E3 /* SkeletonJoint.swift */, + 2E18E1482BB2A93B00D4C1E3 /* Enums */, + ); + path = Skeleton; + sourceTree = ""; + }; + 2E18E15C2BB2BB3500D4C1E3 /* SkeletonHelper */ = { + isa = PBXGroup; + children = ( + 2E18E1522BB2AC9300D4C1E3 /* BodySkeleton.swift */, + 2E18E1532BB2AC9300D4C1E3 /* BodySkeleton+AngleDetection.swift */, + 2E18E1542BB2AC9300D4C1E3 /* BodySkeleton+Entity.swift */, + ); + path = SkeletonHelper; + sourceTree = ""; + }; 2E7021492B8EFBD500089680 /* Body Detection with AR */ = { isa = PBXGroup; children = ( @@ -351,6 +451,7 @@ 2E962C6C2B67A10400D0903D /* USDZ */ = { isa = PBXGroup; children = ( + 2E41DFC92BE220250012F773 /* Face Models */, 2E962C8E2B69017200D0903D /* robot_walk_idle.usdz */, 2E90B3EC2B87636800D3DB90 /* biped_robot.usdz */, ); @@ -457,6 +558,7 @@ 4931C78C20D3988E002F907B /* AVTextureEnvironment.swift */, 8127E0E220D7E5B500D8CD7F /* AVImageDetaction.swift */, 2E7021492B8EFBD500089680 /* Body Detection with AR */, + 2E2C35C22BE215CB00200E7E /* Face Detection with AR */, 2E04C9032B4EE267000B4936 /* Detecting Images in AR */, 2E75034B2B31E00400DF78E1 /* Tracking and altering images */, 2E7503222B30227F00DF78E1 /* Surface Detection */, @@ -627,11 +729,19 @@ 8127E0E820D7E6DC00D8CD7F /* giphy.gif in Resources */, 4969EA8F20D109FC00F8AE9E /* art.scnassets in Resources */, 4991D8F520D907B500BF6564 /* Loky.gif in Resources */, + 
2E41DFD12BE2206F0012F773 /* Heart.usdz in Resources */, + 2E41DFD32BE2206F0012F773 /* Cyclops.usdz in Resources */, + 2E7D4AFC2BBADA0600DE0004 /* Metal_Round_Glasses.usdz in Resources */, + 2E41DFD62BE2206F0012F773 /* Swag.usdz in Resources */, 4969EA9920D109FD00F8AE9E /* LaunchScreen.storyboard in Resources */, 2E962C8F2B69017200D0903D /* robot_walk_idle.usdz in Resources */, 2E90B3ED2B87636800D3DB90 /* biped_robot.usdz in Resources */, + 2E41DFD22BE2206F0012F773 /* Glasses.usdz in Resources */, 4969EA9620D109FD00F8AE9E /* Assets.xcassets in Resources */, + 2E41DFD42BE2206F0012F773 /* Neon.usdz in Resources */, 4969EA9420D109FC00F8AE9E /* Main.storyboard in Resources */, + 2E41DFD52BE2206F0012F773 /* Star.usdz in Resources */, + 2E41DFD72BE2206F0012F773 /* Animoji.usdz in Resources */, ); runOnlyForDeploymentPostprocessing = 0; }; @@ -685,6 +795,7 @@ 2E7503282B30640100DF78E1 /* ARSurfaceDetectionVC.swift in Sources */, 2E7021562B95ACD600089680 /* SCNVector3+Extension.swift in Sources */, 4968C0E520D14B3200D384F0 /* AVChoiceVC.swift in Sources */, + 2E2C35C42BE215F100200E7E /* ARFaceDetection.swift in Sources */, 4969EA9120D109FC00F8AE9E /* AVSharingWorldMapVC.swift in Sources */, 4904E80F20D776B3002F5210 /* PointCloud+CreateVisualization.swift in Sources */, 2E75032D2B30913E00DF78E1 /* Utility.swift in Sources */, @@ -721,6 +832,7 @@ 4968C0DB20D12AA700D384F0 /* ThresholdPanGestureRecognizer.swift in Sources */, 4904E80C20D776B3002F5210 /* DetectedObject.swift in Sources */, 2E15A59A2B32FE85001EA792 /* ARImageDetectorVC.swift in Sources */, + 2E2C35C62BE216FC00200E7E /* ModelCollectionCell.swift in Sources */, 2E9300F92B469633002BF5D6 /* StyleTransferModel.mlpackage in Sources */, 4968C0CC20D12AA700D384F0 /* RoundedButton.swift in Sources */, ); diff --git a/iOS12_Sampler/ios12 Sampler/AVDetailsVC.swift b/iOS12_Sampler/ios12 Sampler/AVDetailsVC.swift index 71d12ff..c8885cf 100644 --- a/iOS12_Sampler/ios12 Sampler/AVDetailsVC.swift +++ b/iOS12_Sampler/ios12 Sampler/AVDetailsVC.swift @@ -32,7 +32,7 @@ class AVDetailsVC: BaseCameraVC { } @IBAction func btnSurfaceDetectionClicked(_ sender: UIButton) { - let vc = self.storyboard?.instantiateViewController(withIdentifier: "ARSurfaceDetectionVC") as? ARSurfaceDetectionVC + let vc = self.storyboard?.instantiateViewController(withIdentifier: "ARFaceDetection") as? 
ARFaceDetection self.navigationController?.pushViewController(vc!, animated: true) } diff --git a/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Contents.json b/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Contents.json new file mode 100644 index 0000000..73c0059 --- /dev/null +++ b/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Contents.json @@ -0,0 +1,6 @@ +{ + "info" : { + "author" : "xcode", + "version" : 1 + } +} diff --git a/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Cyclops.imageset/Contents.json b/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Cyclops.imageset/Contents.json new file mode 100644 index 0000000..b9d7b65 --- /dev/null +++ b/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Cyclops.imageset/Contents.json @@ -0,0 +1,21 @@ +{ + "images" : [ + { + "idiom" : "universal", + "scale" : "1x" + }, + { + "filename" : "Cyclops.png", + "idiom" : "universal", + "scale" : "2x" + }, + { + "idiom" : "universal", + "scale" : "3x" + } + ], + "info" : { + "author" : "xcode", + "version" : 1 + } +} diff --git a/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Cyclops.imageset/Cyclops.png b/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Cyclops.imageset/Cyclops.png new file mode 100644 index 0000000..0a1f703 Binary files /dev/null and b/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Cyclops.imageset/Cyclops.png differ diff --git a/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Glasses.imageset/Contents.json b/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Glasses.imageset/Contents.json new file mode 100644 index 0000000..668a8e6 --- /dev/null +++ b/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Glasses.imageset/Contents.json @@ -0,0 +1,21 @@ +{ + "images" : [ + { + "idiom" : "universal", + "scale" : "1x" + }, + { + "filename" : "Glasses.png", + "idiom" : "universal", + "scale" : "2x" + }, + { + "idiom" : "universal", + "scale" : "3x" + } + ], + "info" : { + "author" : "xcode", + "version" : 1 + } +} diff --git a/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Glasses.imageset/Glasses.png b/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Glasses.imageset/Glasses.png new file mode 100644 index 0000000..39071f8 Binary files /dev/null and b/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Glasses.imageset/Glasses.png differ diff --git a/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Heart.imageset/Contents.json b/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Heart.imageset/Contents.json new file mode 100644 index 0000000..ba142fa --- /dev/null +++ b/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Heart.imageset/Contents.json @@ -0,0 +1,21 @@ +{ + "images" : [ + { + "idiom" : "universal", + "scale" : "1x" + }, + { + "filename" : "heart2.png", + "idiom" : "universal", + "scale" : "2x" + }, + { + "idiom" : "universal", + "scale" : "3x" + } + ], + "info" : { + "author" : "xcode", + "version" : 1 + } +} diff --git a/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Heart.imageset/heart2.png b/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Heart.imageset/heart2.png new file mode 100644 index 0000000..9934e7d Binary files /dev/null and b/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Heart.imageset/heart2.png differ diff --git a/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Neon.imageset/Contents.json b/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Neon.imageset/Contents.json new file mode 100644 index 0000000..738922b --- /dev/null +++ b/iOS12_Sampler/ios12 
Sampler/Assets.xcassets/Models/Neon.imageset/Contents.json @@ -0,0 +1,21 @@ +{ + "images" : [ + { + "idiom" : "universal", + "scale" : "1x" + }, + { + "filename" : "Neon.png", + "idiom" : "universal", + "scale" : "2x" + }, + { + "idiom" : "universal", + "scale" : "3x" + } + ], + "info" : { + "author" : "xcode", + "version" : 1 + } +} diff --git a/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Neon.imageset/Neon.png b/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Neon.imageset/Neon.png new file mode 100644 index 0000000..d63b03e Binary files /dev/null and b/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Neon.imageset/Neon.png differ diff --git a/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Robo.imageset/Contents.json b/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Robo.imageset/Contents.json new file mode 100644 index 0000000..fc9aa21 --- /dev/null +++ b/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Robo.imageset/Contents.json @@ -0,0 +1,21 @@ +{ + "images" : [ + { + "idiom" : "universal", + "scale" : "1x" + }, + { + "filename" : "Robo.png", + "idiom" : "universal", + "scale" : "2x" + }, + { + "idiom" : "universal", + "scale" : "3x" + } + ], + "info" : { + "author" : "xcode", + "version" : 1 + } +} diff --git a/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Robo.imageset/Robo.png b/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Robo.imageset/Robo.png new file mode 100644 index 0000000..7eb378b Binary files /dev/null and b/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Robo.imageset/Robo.png differ diff --git a/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Star.imageset/Contents.json b/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Star.imageset/Contents.json new file mode 100644 index 0000000..e2041ea --- /dev/null +++ b/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Star.imageset/Contents.json @@ -0,0 +1,21 @@ +{ + "images" : [ + { + "idiom" : "universal", + "scale" : "1x" + }, + { + "filename" : "Star.png", + "idiom" : "universal", + "scale" : "2x" + }, + { + "idiom" : "universal", + "scale" : "3x" + } + ], + "info" : { + "author" : "xcode", + "version" : 1 + } +} diff --git a/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Star.imageset/Star.png b/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Star.imageset/Star.png new file mode 100644 index 0000000..a3e472e Binary files /dev/null and b/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Star.imageset/Star.png differ diff --git a/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Swag.imageset/Contents.json b/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Swag.imageset/Contents.json new file mode 100644 index 0000000..2c8dd00 --- /dev/null +++ b/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Swag.imageset/Contents.json @@ -0,0 +1,21 @@ +{ + "images" : [ + { + "idiom" : "universal", + "scale" : "1x" + }, + { + "filename" : "Swag.png", + "idiom" : "universal", + "scale" : "2x" + }, + { + "idiom" : "universal", + "scale" : "3x" + } + ], + "info" : { + "author" : "xcode", + "version" : 1 + } +} diff --git a/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Swag.imageset/Swag.png b/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Swag.imageset/Swag.png new file mode 100644 index 0000000..e996f7e Binary files /dev/null and b/iOS12_Sampler/ios12 Sampler/Assets.xcassets/Models/Swag.imageset/Swag.png differ diff --git a/iOS12_Sampler/ios12 Sampler/Base.lproj/Main.storyboard b/iOS12_Sampler/ios12 Sampler/Base.lproj/Main.storyboard index 0647b02..392e913 100644 --- 
a/iOS12_Sampler/ios12 Sampler/Base.lproj/Main.storyboard +++ b/iOS12_Sampler/ios12 Sampler/Base.lproj/Main.storyboard @@ -1,9 +1,9 @@ - + - + @@ -1401,7 +1401,6 @@ Image Filter @@ -1439,7 +1438,7 @@ Image Filter diff --git a/iOS12_Sampler/ios12 Sampler/Extensions/Float+Extension.swift b/iOS12_Sampler/ios12 Sampler/Extensions/Float+Extension.swift index 31370ae..f92a00d 100644 --- a/iOS12_Sampler/ios12 Sampler/Extensions/Float+Extension.swift +++ b/iOS12_Sampler/ios12 Sampler/Extensions/Float+Extension.swift @@ -19,4 +19,8 @@ extension Float { self * 180 / .pi } } + + var toPoints: Self { + return (self * 2835) + } } diff --git a/iOS12_Sampler/ios12 Sampler/Face Detection with AR/ARFaceDetection.swift b/iOS12_Sampler/ios12 Sampler/Face Detection with AR/ARFaceDetection.swift new file mode 100644 index 0000000..9cae82a --- /dev/null +++ b/iOS12_Sampler/ios12 Sampler/Face Detection with AR/ARFaceDetection.swift @@ -0,0 +1,171 @@ +// +// ARFaceDetection.swift +// ios12 Sampler +// +// Created by Dhruvil Vora on 01/05/24. +// Copyright © 2024 Testing. All rights reserved. +// + +import RealityKit +import Vision +import ARKit +// Comment + +class ARFaceDetection: UIViewController { + + // MARK: IBOutlets + @IBOutlet var arview: ARView! + @IBOutlet weak var circleVw: UIView! + @IBOutlet weak var modelCollectionView: UICollectionView! + + // MARK: Variables + var timer: Timer? + var modelWidth: Float = 0.0 + var deviceWidth: Float = 0.0 + var currentLoadedEntity: Entity! + var parentAnchorEntity: AnchorEntity? + var models: [String] = ["Neon", "Heart", "Star", "Swag", "Glasses", "Animoji", "Cyclops"] + + // MARK: ViewDidLoad + override func viewDidLoad() { + super.viewDidLoad() + setupARConfiguration() + setupCircleView() + setUpCollectionView() + loadModel(withModelIndex: 0) + } + + private func setupARConfiguration() { + let configuaration = ARFaceTrackingConfiguration() + configuaration.isLightEstimationEnabled = true + arview.session.run(configuaration, options: [.resetTracking, .removeExistingAnchors]) + arview.session.delegate = self + } + + private func setupCircleView() { + deviceWidth = Float(UIScreen.main.bounds.size.width) + 100 + circleVw.layer.cornerRadius = (circleVw.layer.frame.width / 2) + circleVw.layer.borderWidth = 10 + circleVw.layer.borderColor = UIColor.green.cgColor + } + + private func setUpCollectionView() { + let layout: UICollectionViewFlowLayout = UICollectionViewFlowLayout() + layout.sectionInset = UIEdgeInsets(top: 0, left: ((UIScreen.main.bounds.width) / 2) - 50, + bottom: 0, right: ((UIScreen.main.bounds.width) / 2) - 50) + layout.scrollDirection = .horizontal + modelCollectionView.collectionViewLayout = layout + modelCollectionView.delegate = self + modelCollectionView.dataSource = self + modelCollectionView.reloadData() + } + + // Calculate the scaling of model using sertain params + private func newCalculateScalingModel(modelWidth: Float) -> Float { + // Convert Models width to points + let modelWidthInPts = modelWidth.toPoints + // Then need to divide device width with the converted width in points + // to get the ratio + let convertedRatio = deviceWidth/modelWidthInPts + return convertedRatio + } + + private func loadModel(withModelIndex index: Int) { + // Need to remove existing model if any is placed in ARView + parentAnchorEntity?.removeChild(currentLoadedEntity) + // load the model + let entity = try! 
ModelEntity.load(named: models[index]) + // Get the bounds of loaded model + let entityBounds = entity.visualBounds(relativeTo: nil) + // Get the width of model from bounding box of entity + modelWidth = entityBounds.extents.x + currentLoadedEntity = entity + // Calculate and scale the model + let scaledModelCalculation = newCalculateScalingModel(modelWidth: modelWidth) + entity.setScale([scaledModelCalculation, scaledModelCalculation, scaledModelCalculation], relativeTo: nil) + + // create a AnchorEntity tracking face + if let anchorEntity = parentAnchorEntity { + parentAnchorEntity = anchorEntity + } else { + parentAnchorEntity = AnchorEntity(.face) + } + parentAnchorEntity?.addChild(entity) + arview.scene.anchors.append(parentAnchorEntity!) + } +} + +// MARK: UIScrollViewDelegate +extension ARFaceDetection { + func scrollViewDidEndDragging(_ scrollView: UIScrollView, willDecelerate decelerate: Bool) { + if !decelerate { + stopScrolling(scrollView: scrollView) + } + } + + func scrollViewDidEndDecelerating(_ scrollView: UIScrollView) { + stopScrolling(scrollView: scrollView) + } + + private func stopScrolling(scrollView: UIScrollView) { + let modularCount = (Float(scrollView.contentOffset.x) / 95) + guard Int(round(modularCount)) <= models.count else { return } + scrollAndLoadModel(with: Int(modularCount)) + } + + private func scrollAndLoadModel(with indexToScroll: Int) { + modelCollectionView.scrollToItem(at: IndexPath(item: indexToScroll, section: 0), at: .centeredHorizontally, animated: true) + loadModel(withModelIndex: indexToScroll) + } +} + +// MARK: ARSessionDelegate +extension ARFaceDetection: ARSessionDelegate { + func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) { + for anchor in anchors { + // Track anchor of Face only & For Animoji only + guard let anchor = anchor as? ARFaceAnchor, currentLoadedEntity.findEntity(named: "Animoji") != nil else { continue } + // Get Eyebrows and Jaws displacement value + guard let browOuterUpLeftShape = anchor.blendShapes[.browOuterUpLeft]?.floatValue, + let browOuterUpRightShape = anchor.blendShapes[.browOuterUpRight]?.floatValue, + let jawShape = anchor.blendShapes[.jawOpen]?.floatValue else { continue } + + // Then find that particular entity from Animoji model + guard let leftBrowEntity = currentLoadedEntity?.findEntity(named: "left_eyebrow"), + let rightBrowEntity = currentLoadedEntity?.findEntity(named: "right_eyebrow"), + let jawEntity = currentLoadedEntity?.findEntity(named: "jaw") else { continue } + + // Change the position of that entity realtime to reflect user's expression + leftBrowEntity.position.y = 0.082 + (browOuterUpLeftShape / 10) + rightBrowEntity.position.y = 0.082 + (browOuterUpRightShape / 10) + jawEntity.position.y = -0.067 - (jawShape / 10) + } + } +} + +// MARK: UICollectionViewDataSource +extension ARFaceDetection: UICollectionViewDataSource { + func collectionView(_ collectionView: UICollectionView, numberOfItemsInSection section: Int) -> Int { + return models.count + } + + func collectionView(_ collectionView: UICollectionView, cellForItemAt indexPath: IndexPath) -> UICollectionViewCell { + guard let collectionViewCell: ModelCollectionCell = collectionView.dequeueReusableCell(withReuseIdentifier: "ModelCollectionCell", for: indexPath) as? 
ModelCollectionCell else { return UICollectionViewCell() } + collectionViewCell.configureCell(indexPath: indexPath.item) + return collectionViewCell + } +} + +// MARK: UICollectionViewDelegate +extension ARFaceDetection: UICollectionViewDelegate { + func collectionView(_ collectionView: UICollectionView, didSelectItemAt indexPath: IndexPath) { + scrollAndLoadModel(with: indexPath.item) + } +} + +// MARK: UICollectionViewDelegateFlowLayout +extension ARFaceDetection: UICollectionViewDelegateFlowLayout { + func collectionView(_ collectionView: UICollectionView, layout collectionViewLayout: UICollectionViewLayout, sizeForItemAt indexPath: IndexPath) -> CGSize { + CGSize(width: 100, height: 100) + } +} diff --git a/iOS12_Sampler/ios12 Sampler/Face Detection with AR/Cell/ModelCollectionCell.swift b/iOS12_Sampler/ios12 Sampler/Face Detection with AR/Cell/ModelCollectionCell.swift new file mode 100644 index 0000000..94bee33 --- /dev/null +++ b/iOS12_Sampler/ios12 Sampler/Face Detection with AR/Cell/ModelCollectionCell.swift @@ -0,0 +1,21 @@ +// +// ModelCollectionCell.swift +// ios12 Sampler +// +// Created by Dhruvil Vora on 01/05/24. +// Copyright © 2024 Testing. All rights reserved. +// + +import UIKit + +class ModelCollectionCell: UICollectionViewCell { + + var modelImgs: [String] = ["Neon", "Heart", "Star", "Swag", "Glasses", "Robo", "Cyclops"] + + @IBOutlet weak var modelImgVw: UIImageView! + + func configureCell(indexPath: Int) { + modelImgVw.image = UIImage(named: modelImgs[indexPath]) + modelImgVw.layer.cornerRadius = (modelImgVw.layer.frame.width / 2) + } +} diff --git a/iOS12_Sampler/ios12 Sampler/Tracking and altering images/Utilities/ImageTrackingUtility.swift b/iOS12_Sampler/ios12 Sampler/Tracking and altering images/Utilities/ImageTrackingUtility.swift index d051083..07f0289 100644 --- a/iOS12_Sampler/ios12 Sampler/Tracking and altering images/Utilities/ImageTrackingUtility.swift +++ b/iOS12_Sampler/ios12 Sampler/Tracking and altering images/Utilities/ImageTrackingUtility.swift @@ -91,7 +91,7 @@ func createPlaneNode(size: CGSize, rotation: Float, content: Any?) 
-> SCNNode { } func getFilterData() -> [FilterModel] { - return [FilterModel(filterDummyImage: UIImage(named: "random")!, filterName: "Random", isSelected: true, selectedFilterStyle: .randomStyle), + return [FilterModel(filterDummyImage: UIImage(named: "Random")!, filterName: "Random", isSelected: true, selectedFilterStyle: .randomStyle), FilterModel(filterDummyImage: UIImage(named: "style1")!, filterName: "Style1", isSelected: false), FilterModel(filterDummyImage: UIImage(named: "style2")!, filterName: "Style2", isSelected: false), FilterModel(filterDummyImage: UIImage(named: "style3")!, filterName: "Style3", isSelected: false), diff --git a/iOS12_Sampler/ios12 Sampler/USDZ/Face Models/Animoji.usdz b/iOS12_Sampler/ios12 Sampler/USDZ/Face Models/Animoji.usdz new file mode 100644 index 0000000..470c45d Binary files /dev/null and b/iOS12_Sampler/ios12 Sampler/USDZ/Face Models/Animoji.usdz differ diff --git a/iOS12_Sampler/ios12 Sampler/USDZ/Face Models/Cyclops.usdz b/iOS12_Sampler/ios12 Sampler/USDZ/Face Models/Cyclops.usdz new file mode 100644 index 0000000..32caf7f Binary files /dev/null and b/iOS12_Sampler/ios12 Sampler/USDZ/Face Models/Cyclops.usdz differ diff --git a/iOS12_Sampler/ios12 Sampler/USDZ/Face Models/Glasses.usdz b/iOS12_Sampler/ios12 Sampler/USDZ/Face Models/Glasses.usdz new file mode 100644 index 0000000..64ca484 Binary files /dev/null and b/iOS12_Sampler/ios12 Sampler/USDZ/Face Models/Glasses.usdz differ diff --git a/iOS12_Sampler/ios12 Sampler/USDZ/Face Models/Heart.usdz b/iOS12_Sampler/ios12 Sampler/USDZ/Face Models/Heart.usdz new file mode 100644 index 0000000..909ceeb Binary files /dev/null and b/iOS12_Sampler/ios12 Sampler/USDZ/Face Models/Heart.usdz differ diff --git a/iOS12_Sampler/ios12 Sampler/USDZ/Face Models/Metal_Round_Glasses.usdz b/iOS12_Sampler/ios12 Sampler/USDZ/Face Models/Metal_Round_Glasses.usdz new file mode 100644 index 0000000..f7d5cc9 Binary files /dev/null and b/iOS12_Sampler/ios12 Sampler/USDZ/Face Models/Metal_Round_Glasses.usdz differ diff --git a/iOS12_Sampler/ios12 Sampler/USDZ/Face Models/Neon.usdz b/iOS12_Sampler/ios12 Sampler/USDZ/Face Models/Neon.usdz new file mode 100644 index 0000000..6463af4 Binary files /dev/null and b/iOS12_Sampler/ios12 Sampler/USDZ/Face Models/Neon.usdz differ diff --git a/iOS12_Sampler/ios12 Sampler/USDZ/Face Models/Star.usdz b/iOS12_Sampler/ios12 Sampler/USDZ/Face Models/Star.usdz new file mode 100644 index 0000000..b2049de Binary files /dev/null and b/iOS12_Sampler/ios12 Sampler/USDZ/Face Models/Star.usdz differ diff --git a/iOS12_Sampler/ios12 Sampler/USDZ/Face Models/Swag.usdz b/iOS12_Sampler/ios12 Sampler/USDZ/Face Models/Swag.usdz new file mode 100644 index 0000000..a859572 Binary files /dev/null and b/iOS12_Sampler/ios12 Sampler/USDZ/Face Models/Swag.usdz differ diff --git a/iOS12_Sampler/ios12 Sampler/USDZ/Glasses/80s_Sunglasses_that_look_litt_it_you_ask_me.usdz b/iOS12_Sampler/ios12 Sampler/USDZ/Glasses/80s_Sunglasses_that_look_litt_it_you_ask_me.usdz new file mode 100644 index 0000000..986917a Binary files /dev/null and b/iOS12_Sampler/ios12 Sampler/USDZ/Glasses/80s_Sunglasses_that_look_litt_it_you_ask_me.usdz differ diff --git a/iOS12_Sampler/ios12 Sampler/USDZ/Glasses/Cheese_Sunglasses.usdz b/iOS12_Sampler/ios12 Sampler/USDZ/Glasses/Cheese_Sunglasses.usdz new file mode 100644 index 0000000..b69099c Binary files /dev/null and b/iOS12_Sampler/ios12 Sampler/USDZ/Glasses/Cheese_Sunglasses.usdz differ diff --git a/iOS12_Sampler/ios12 Sampler/USDZ/Glasses/Cyclops_sunglasses.usdz b/iOS12_Sampler/ios12 
Sampler/USDZ/Glasses/Cyclops_sunglasses.usdz new file mode 100644 index 0000000..facfbe9 Binary files /dev/null and b/iOS12_Sampler/ios12 Sampler/USDZ/Glasses/Cyclops_sunglasses.usdz differ diff --git a/iOS12_Sampler/ios12 Sampler/USDZ/Glasses/Deal_With_It_Sunglasses.usdz b/iOS12_Sampler/ios12 Sampler/USDZ/Glasses/Deal_With_It_Sunglasses.usdz new file mode 100644 index 0000000..5f3ad52 Binary files /dev/null and b/iOS12_Sampler/ios12 Sampler/USDZ/Glasses/Deal_With_It_Sunglasses.usdz differ diff --git a/iOS12_Sampler/ios12 Sampler/USDZ/Glasses/Montgomery_Gators_Star_Sunglasses.usdz b/iOS12_Sampler/ios12 Sampler/USDZ/Glasses/Montgomery_Gators_Star_Sunglasses.usdz new file mode 100644 index 0000000..1f357f0 Binary files /dev/null and b/iOS12_Sampler/ios12 Sampler/USDZ/Glasses/Montgomery_Gators_Star_Sunglasses.usdz differ diff --git a/iOS12_Sampler/ios12 Sampler/USDZ/Glasses/Neon_Party_Glasses.usdz b/iOS12_Sampler/ios12 Sampler/USDZ/Glasses/Neon_Party_Glasses.usdz new file mode 100644 index 0000000..6797f36 Binary files /dev/null and b/iOS12_Sampler/ios12 Sampler/USDZ/Glasses/Neon_Party_Glasses.usdz differ diff --git a/iOS12_Sampler/ios12 Sampler/USDZ/Glasses/Occhiale_Goccia__Sunglasses.usdz b/iOS12_Sampler/ios12 Sampler/USDZ/Glasses/Occhiale_Goccia__Sunglasses.usdz new file mode 100644 index 0000000..6fb165f Binary files /dev/null and b/iOS12_Sampler/ios12 Sampler/USDZ/Glasses/Occhiale_Goccia__Sunglasses.usdz differ diff --git a/iOS12_Sampler/ios12 Sampler/USDZ/Glasses/Party_Glasses_Heart_type.usdz b/iOS12_Sampler/ios12 Sampler/USDZ/Glasses/Party_Glasses_Heart_type.usdz new file mode 100644 index 0000000..dd50a13 Binary files /dev/null and b/iOS12_Sampler/ios12 Sampler/USDZ/Glasses/Party_Glasses_Heart_type.usdz differ diff --git a/iOS12_Sampler/ios12 Sampler/USDZ/Glasses/Sunglasses.usdz b/iOS12_Sampler/ios12 Sampler/USDZ/Glasses/Sunglasses.usdz new file mode 100644 index 0000000..8073135 Binary files /dev/null and b/iOS12_Sampler/ios12 Sampler/USDZ/Glasses/Sunglasses.usdz differ