Show bounding box while detecting object using ARKit 2

再見小時候 2021-02-04 22:39

I have scanned and trained multiple real world objects. I do have the ARReferenceObject and the app detects them fine.

The issue that I'm facing is when a

1 Answer
  •  逝去的感伤
    2021-02-04 23:28

    It is possible to show a boundingBox for the ARReferenceObject before it is detected, although I am not sure why you would want to do that (in advance, anyway).

    For example, assuming your referenceObject sits on a horizontal surface, you would first need to place your estimated bounding box on the plane (or use some other method to place it in advance); however, in the time it takes to detect the ARPlaneAnchor and place the boundingBox, your model would most likely have been detected already.
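
    If you did want to try that route anyway, a rough sketch of the plane-based placement could look like the following, which would sit in your ARSCNViewDelegate. Note that `estimatedBoundingBoxNode` is a hypothetical node built in advance from the locally stored ARReferenceObject, and it assumes planeDetection = .horizontal has been enabled on your configuration:

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {

        //1. Only Act On Detected Horizontal Planes
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }

        //2. Centre The Hypothetical estimatedBoundingBoxNode On The Plane
        estimatedBoundingBoxNode.simdPosition = planeAnchor.center

        //3. Raise It By Half Its Height So It Rests On Top Of The Plane (Assuming A Centred Box)
        let halfHeight = (estimatedBoundingBoxNode.boundingBox.max.y - estimatedBoundingBoxNode.boundingBox.min.y) / 2
        estimatedBoundingBoxNode.position.y += halfHeight

        //4. Attach It To The Plane's Node
        node.addChildNode(estimatedBoundingBoxNode)
    }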

    Possible Approach:

    As you are no doubt aware, an ARReferenceObject has center, extent and scale properties, as well as a set of rawFeaturePoints associated with the object.

    As such, based on some of the sample code from Apple's Scanning & Detecting 3D Objects project, we can create our own SCNNode subclass which will display a bounding box of the approximate size of the locally stored ARReferenceObject prior to it being detected.

    Note that you will need to locate the 'wireframe_shader' from the Apple sample code for the boundingBox to render as a transparent wireframe:

    import Foundation
    import ARKit
    import SceneKit
    
    class BlackMirrorzBoundingBox: SCNNode {
    
        //-----------------------
        // MARK: - Initialization
        //-----------------------
    
        /// Creates A WireFrame Bounding Box From The Data Retrieved From The ARReferenceObject
        ///
        /// - Parameters:
        ///   - points: [float3]
        ///   - scale: CGFloat
        ///   - color: UIColor
        init(points: [float3], scale: CGFloat, color: UIColor = .cyan) {
            super.init()
    
            var localMin = float3(Float.greatestFiniteMagnitude)
            var localMax = float3(-Float.greatestFiniteMagnitude)
    
            for point in points {
                localMin = min(localMin, point)
                localMax = max(localMax, point)
            }
    
            // Centre The Node On The Feature Points, Then Derive The Extent From Their Spread
            // Note: The `scale` Parameter Is Not Applied Here, As The rawFeaturePoints Are Already
            // Expressed In The ARReferenceObject's Local Coordinate Space
            self.simdPosition += (localMax + localMin) / 2
            let extent = localMax - localMin
    
            let wireFrame = SCNNode()
            let box = SCNBox(width: CGFloat(extent.x), height: CGFloat(extent.y), length: CGFloat(extent.z), chamferRadius: 0)
            box.firstMaterial?.diffuse.contents = color
            box.firstMaterial?.isDoubleSided = true
            wireFrame.geometry = box
            setupShaderOnGeometry(box)
            self.addChildNode(wireFrame)
        }
    
        required init?(coder aDecoder: NSCoder) { fatalError("init(coder:) Has Not Been Implemented") }
    
        //----------------
        // MARK: - Shaders
        //----------------
    
        /// Sets A Shader To Render The Cube As A Wireframe
        ///
        /// - Parameter geometry: SCNBox
        func setupShaderOnGeometry(_ geometry: SCNBox) {
            guard let path = Bundle.main.path(forResource: "wireframe_shader", ofType: "metal", inDirectory: "art.scnassets"),
                let shader = try? String(contentsOfFile: path, encoding: .utf8) else {
    
                    return
            }
    
            geometry.firstMaterial?.shaderModifiers = [.surface: shader]
        }
    
    }
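
    If you can't locate the shader, or don't want to include a Metal file in your project, a cruder fallback (just a sketch, not from the Apple sample) is to render the SCNBox as a line wireframe via the material's fillMode (available from iOS 11), instead of calling setupShaderOnGeometry(box) in the initializer above:

    //Fallback: Render The Box As Lines Rather Than Using The Custom Surface Shader
    box.firstMaterial?.fillMode = .lines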
    

    To display the bounding box you would then do something like the following, noting that in my example I have the following variables:

     @IBOutlet var augmentedRealityView: ARSCNView!
     let configuration = ARWorldTrackingConfiguration()
     let augmentedRealitySession = ARSession()
    

    To display the boundingBox prior to detection of the actual object itself, you would call the func loadBoundingBox in viewDidLoad, e.g.:

    /// Creates A Bounding Box From The Data Available From The ARObject In The Local Bundle
    func loadBoundingBox(){
    
        //1. Run Our Session
        augmentedRealityView.session = augmentedRealitySession
        augmentedRealityView.delegate = self
    
        //2. Load A Single ARReferenceObject From The Main Bundle
        if let objectURL = Bundle.main.url(forResource: "fox", withExtension: "arobject"){
    
            do{
                var referenceObjects = [ARReferenceObject]()
                let object = try ARReferenceObject(archiveURL: objectURL)
    
                //3. Log Its Properties
                print("""
                    Object Center = \(object.center)
                    Object Extent = \(object.extent)
                    Object Scale = \(object.scale)
                    """)
    
                //4. Get Its Scale
                let scale = CGFloat(object.scale.x)
    
                //5. Create A Bounding Box
                let boundingBoxNode = BlackMirrorzBoundingBox(points: object.rawFeaturePoints.points, scale: scale)
    
                //6. Add It To The ARSCNView
                self.augmentedRealityView.scene.rootNode.addChildNode(boundingBoxNode)
    
                //7. Position It 0.5m Below & 0.5m In Front Of The World Origin (Where The Camera Starts)
                boundingBoxNode.position = SCNVector3(0, -0.5, -0.5)
    
                //8. Add It To The Configuration
                referenceObjects.append(object)
                configuration.detectionObjects = Set(referenceObjects)
    
            }catch{
                print(error)
            }
    
        }
    
        //9. Run The Session
        augmentedRealitySession.run(configuration, options: [.resetTracking, .removeExistingAnchors])
        augmentedRealityView.automaticallyUpdatesLighting = true
    }
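
    Since loadBoundingBox configures and runs the session itself, calling it from viewDidLoad is all that is needed, e.g.:

    override func viewDidLoad() {
        super.viewDidLoad()
        loadBoundingBox()
    }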
    

    The above example simply creates a boundingBox from the not-yet-detected ARReferenceObject and places it 0.5m down and 0.5m in front of the camera's starting position, which yields something like this:

    You would of course need to handle the initial position of the boundingBox, as well as how to handle the removal of the boundingBox 'indicator'.
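
    One simple way to handle the removal (a sketch, assuming you give the preview node a name when creating it in loadBoundingBox) is to look it up and remove it once the real object has been detected:

    //In loadBoundingBox, Name The Preview Node So It Can Be Found Later
    boundingBoxNode.name = "previewBoundingBox"

    //Then, Once The Real ARObjectAnchor Has Been Added, Remove The Preview
    augmentedRealityView.scene.rootNode.childNode(withName: "previewBoundingBox", recursively: true)?.removeFromParentNode()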

    The method below simply shows a boundingBox when the actual object is detected, e.g.:

    //--------------------------
    // MARK: - ARSCNViewDelegate
    //--------------------------
    
    extension ViewController: ARSCNViewDelegate{
    
        func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    
            //1. Check We Have A Valid ARObject Anchor 
            guard let objectAnchor = anchor as? ARObjectAnchor else { return }
    
            //2. Create A Bounding Box Around Our Object
            let scale = CGFloat(objectAnchor.referenceObject.scale.x)
            let boundingBoxNode = BlackMirrorzBoundingBox(points: objectAnchor.referenceObject.rawFeaturePoints.points, scale: scale)
            node.addChildNode(boundingBoxNode)
    
        }
    
    }
    

    Which yields something like this:

    In regard to the detection timer, there is an example in the Apple Sample Code, which displays how long it takes to detect the model.

    In its crudest form (not accounting for milliseconds) you can do something like so:

    Firstly create A Timer and a var to store the detection time e.g:

    var detectionTimer = Timer()
    
    var detectionTime: Int = 0
    

    Then, when you run your ARWorldTrackingConfiguration, initialise the timer, e.g.:

    /// Starts The Detection Timer
    func startDetectionTimer(){
    
         detectionTimer = Timer.scheduledTimer(timeInterval: 1.0, target: self, selector: #selector(logDetectionTime), userInfo: nil, repeats: true)
    }
    
    /// Increments The Total Detection Time Before The ARReferenceObject Is Detected
    @objc func logDetectionTime(){
        detectionTime += 1
    
    }
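
    For example, assuming the session is run from loadBoundingBox as above, you could start the timer straight after calling run (a minimal sketch):

    augmentedRealitySession.run(configuration, options: [.resetTracking, .removeExistingAnchors])
    startDetectionTimer()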
    

    Then, when an ARReferenceObject has been detected, invalidate the timer and log the time, e.g.:

    //--------------------------
    // MARK: - ARSCNViewDelegate
    //--------------------------
    
    extension ViewController: ARSCNViewDelegate{
    
        func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    
            //1. Check We Have A Valid ARObject Anchor
            guard let _ = anchor as? ARObjectAnchor else { return }
    
            //2. Stop The Timer
            detectionTimer.invalidate()
    
            //3. Log The Detection Time
            print("Total Detection Time = \(detectionTime) Seconds")
    
            //4. Reset The Detection Time
            detectionTime = 0
    
        }
    
    }
    

    This should be more than enough to get you started...

    And please note that this example doesn't provide a boundingBox while scanning an object (look at the Apple sample code for that); it provides one based on an existing ARReferenceObject, which is what your question implies (assuming I interpreted it correctly).
