NetStream

open class NetStream : NSObject

The NetStream class is the foundation of RTMPStream and HTTPStream.
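
A minimal usage sketch, assuming HaishinKit's RTMPStream subclass and RTMPConnection; the RTMP URL and stream name are placeholders:

    import AVFoundation
    import HaishinKit

    // RTMPStream is a NetStream subclass, so the members documented below apply to it.
    let connection = RTMPConnection()
    let stream = RTMPStream(connection: connection)

    // Attach the default microphone and the back wide-angle camera.
    stream.attachAudio(AVCaptureDevice.default(for: .audio)) { error in
        print("audio attach error: \(error)")
    }
    stream.attachCamera(AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back)) { error in
        print("camera attach error: \(error)")
    }

    // Connect and publish (placeholder endpoint).
    connection.connect("rtmp://example.com/live")
    stream.publish("streamName")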

  • The lock queue used to serialize this stream’s operations.

    Declaration

    Swift

    public let lockQueue: DispatchQueue
  • The mixer object.

    Declaration

    Swift

    public private(set) var mixer: AVMixer { get }
  • Specifies the metadata for the stream.

    Declaration

    Swift

    public var metadata: [String : Any?]
  • Specifies the context object.

    Declaration

    Swift

    public var context: CIContext? { get set }
  • Specifies the device torch, indicating whether it is turned on (true) or off (false).

    Declaration

    Swift

    public var torch: Bool { get set }
  • Specifies the video orientation for the stream.

    Declaration

    Swift

    public var videoOrientation: AVCaptureVideoOrientation { get set }
  • Specifies the audio compression properties (see the settings sketch after this list).

    Declaration

    Swift

    public var audioSettings: Setting<AudioCodec, AudioCodec.Option> { get set }
  • Specifies the video compression properties (see the settings sketch after this list).

    Declaration

    Swift

    public var videoSettings: Setting<VideoCodec, VideoCodec.Option> { get set }
  • Specifies the AVCaptureSession properties (see the settings sketch after this list).

    Declaration

    Swift

    open var captureSettings: Setting<AVMixer, AVMixer.Option> { get set }
  • Specifies the recorder properties.

    Declaration

    Swift

    public var recorderSettings: [AVMediaType : [String : Any]] { get set }
  • Attaches the camera object.

    Warning

    This method can’t be used at the same time as appendSampleBuffer.

    Declaration

    Swift

    open func attachCamera(_ camera: AVCaptureDevice?, onError: ((_ error: NSError) -> Void)? = nil)
  • Attaches the microphone object.

    Warning

    This method can’t be used at the same time as appendSampleBuffer.

    Declaration

    Swift

    open func attachAudio(_ audio: AVCaptureDevice?, automaticallyConfiguresApplicationAudioSession: Bool = false, onError: ((_ error: NSError) -> Void)? = nil)
  • Sets the focus and exposure points of interest (see the camera-control sketch after this list).

    Declaration

    Swift

    public func setPointOfInterest(_ focus: CGPoint, exposure: CGPoint)
  • Appends a CMSampleBuffer (see the ReplayKit sketch after this list).

    Warning

    This method can’t be used at the same time as the attachCamera or attachAudio methods.

    Declaration

    Swift

    open func appendSampleBuffer(_ sampleBuffer: CMSampleBuffer, withType: AVMediaType, options: [NSObject : AnyObject]? = nil)
  • Registers a video effect (see the effect sketch after this list).

    Declaration

    Swift

    public func registerVideoEffect(_ effect: VideoEffect) -> Bool
  • Unregisters a video effect.

    Declaration

    Swift

    public func unregisterVideoEffect(_ effect: VideoEffect) -> Bool
  • Registers an audio effect.

    Declaration

    Swift

    public func registerAudioEffect(_ effect: AudioEffect) -> Bool
  • Unregisters an audio effect.

    Declaration

    Swift

    public func unregisterAudioEffect(_ effect: AudioEffect) -> Bool
  • Starts recording (see the recording sketch after this list).

    Declaration

    Swift

    public func startRecording()
  • Stops recording.

    Declaration

    Swift

    public func stopRecording()
  • Undocumented

    Declaration

    Swift

    public var orientation: AVCaptureVideoOrientation { get set }
  • Attaches a screen capture object.

    Declaration

    Swift

    public func attachScreen(_ screen: CaptureSessionConvertible?, useScreenSize: Bool = true)
  • The current video zoom factor.

    Declaration

    Swift

    public var zoomFactor: CGFloat { get }
  • Sets the video zoom factor, optionally ramping to it at the given rate (see the camera-control sketch after this list).

    Declaration

    Swift

    public func setZoomFactor(_ zoomFactor: CGFloat, ramping: Bool = false, withRate: Float = 2.0)
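
Settings sketch for audioSettings, videoSettings, and captureSettings on a NetStream (or subclass) instance named stream. The option names .bitrate, .width, .height, .fps, and .sessionPreset are assumptions about the Setting key types; verify them against the HaishinKit version in use:

    stream.audioSettings = [
        .bitrate: 64 * 1000,     // AAC bitrate in bits per second
    ]
    stream.videoSettings = [
        .width: 1280,            // encoded width in pixels
        .height: 720,            // encoded height in pixels
        .bitrate: 1_000_000,     // H.264 bitrate in bits per second
    ]
    stream.captureSettings = [
        .fps: 30,                // capture frame rate
        .sessionPreset: AVCaptureSession.Preset.hd1280x720,
    ]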
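
Camera-control sketch for setPointOfInterest and setZoomFactor; the coordinates are in the capture device’s normalized 0...1 point-of-interest space:

    // Focus and expose on the centre of the frame.
    stream.setPointOfInterest(CGPoint(x: 0.5, y: 0.5), exposure: CGPoint(x: 0.5, y: 0.5))

    // Ramp smoothly to 2x zoom.
    stream.setZoomFactor(2.0, ramping: true, withRate: 2.0)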
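
Effect sketch for registerVideoEffect, assuming VideoEffect exposes an execute(_:info:) override point in this HaishinKit version (older releases use execute(_:)); the MonochromeEffect class is hypothetical:

    import CoreImage
    import CoreMedia
    import HaishinKit

    // A hypothetical effect that renders every frame in monochrome.
    final class MonochromeEffect: VideoEffect {
        let filter: CIFilter? = CIFilter(name: "CIColorMonochrome")

        override func execute(_ image: CIImage, info: CMSampleBuffer?) -> CIImage {
            guard let filter = filter else {
                return image
            }
            filter.setValue(image, forKey: kCIInputImageKey)
            return filter.outputImage ?? image
        }
    }

    _ = stream.registerVideoEffect(MonochromeEffect())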
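
ReplayKit sketch for appendSampleBuffer: when frames come from somewhere other than attachCamera/attachAudio, for example a broadcast upload extension, they are appended directly. The SampleHandler class is hypothetical and connection setup is omitted:

    import ReplayKit
    import HaishinKit

    final class SampleHandler: RPBroadcastSampleHandler {
        // A NetStream subclass created for the extension.
        let stream = RTMPStream(connection: RTMPConnection())

        override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
            switch sampleBufferType {
            case .video:
                stream.appendSampleBuffer(sampleBuffer, withType: .video)
            case .audioMic:
                stream.appendSampleBuffer(sampleBuffer, withType: .audio)
            default:
                break
            }
        }
    }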
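
Recording sketch for recorderSettings, startRecording, and stopRecording, assuming the recorder accepts standard AVFoundation output-settings keys (check the expected dictionary shape against the HaishinKit version in use):

    stream.recorderSettings = [
        .audio: [
            AVFormatIDKey: kAudioFormatMPEG4AAC,
            AVSampleRateKey: 44_100,
            AVNumberOfChannelsKey: 2,
        ],
        .video: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: 1280,
            AVVideoHeightKey: 720,
        ],
    ]

    stream.startRecording()
    // ... capture for a while ...
    stream.stopRecording()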