Instead of using AVCaptureSession, I am creating a CMSampleBuffer manually from an array of UIImage and passing it to the textCaptureService.add(sample) method so that the service processes it and extracts the text. Unfortunately, textCaptureService doesn't do anything: I can see that the engine and the service are created successfully, but the service never processes the buffer I pass in.

Is there any way to force the service to start processing after the sample buffer has been added? Or is the only way to make the service process a buffer to call add() from the captureOutput method of AVCaptureVideoDataOutputSampleBufferDelegate?

Below is the snippet of my code where I create the buffer manually and add it to the service:

for i in 1...10 {
    // Fake timing info: one frame per second at a 600 timescale.
    timingInfo.presentationTimeStamp = CMTimeMake(Int64(600 * i), 600)
    timingInfo.duration = CMTimeMake(1, 1)

    // Describe the pixel buffer so a sample buffer can wrap it.
    let err = CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pixBuf!, &videoInfo)
    print("CMF Status \(err)")
    print("CMVF debug: \(videoInfo.debugDescription)")

    // Wrap the pixel buffer in a ready-to-use CMSampleBuffer.
    let ret = CMSampleBufferCreateReadyWithImageBuffer(kCFAllocatorDefault, pixBuf!, videoInfo!, &timingInfo, &sample)
    print("Buffer debug: \(sample.debugDescription)")
    print("Status \(ret)")

    self.textCaptureService?.add(sample)
}
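For completeness, this is roughly how I build the pixel buffer (pixBuf) from one of the UIImages. This is a minimal sketch; the 32BGRA pixel format is an assumption on my part about what the service expects:

```swift
import UIKit
import CoreVideo

// Sketch: render a UIImage into a newly created CVPixelBuffer.
// Assumption: kCVPixelFormatType_32BGRA is acceptable to the capture service.
func makePixelBuffer(from image: UIImage) -> CVPixelBuffer? {
    guard let cgImage = image.cgImage else { return nil }
    let width = cgImage.width
    let height = cgImage.height

    let attrs: [CFString: Any] = [
        kCVPixelBufferCGImageCompatibilityKey: true,
        kCVPixelBufferCGBitmapContextCompatibilityKey: true
    ]
    var pixelBuffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                                     kCVPixelFormatType_32BGRA,
                                     attrs as CFDictionary, &pixelBuffer)
    guard status == kCVReturnSuccess, let buffer = pixelBuffer else { return nil }

    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }

    // Draw the image into the pixel buffer's backing memory.
    guard let context = CGContext(data: CVPixelBufferGetBaseAddress(buffer),
                                  width: width, height: height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
                                      | CGBitmapInfo.byteOrder32Little.rawValue)
    else { return nil }
    context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))
    return buffer
}
```

In the loop above, pixBuf would come from makePixelBuffer(from: images[i - 1]).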