Realtime Stream Processor

A class for splitting and processing real-time streaming frames using the Camera package and Google ML Kit

📋 Overview

This library provides an efficient stream processor that receives frames from the camera in real time and processes them asynchronously through an internal queue.

✨ Key Features

  • ✅ Real-time camera streaming: frame capture using the Camera package
  • ✅ Input queue management: frames are queued and processed sequentially
  • ✅ ML Kit integration: automatic conversion to Google ML Kit InputImage
  • ✅ Asynchronous frame processing: frames are processed in the background
  • ✅ Queue overflow prevention: configurable maximum queue size with automatic frame dropping
  • ✅ Pause/resume: stream control
  • ✅ Statistics monitoring: FPS, processed frames, dropped frames, and more
  • ✅ Error handling: a dedicated error stream
  • ✅ Customization: user-defined frame processing callback

📦 Installation

Add the following dependencies to your pubspec.yaml:

dependencies:
  camera: ^0.10.5+7
  google_mlkit_commons: ^0.6.0
  image: ^4.1.3

🚀 Quick Start

1. Basic Usage

import 'package:realtime_stream_processor/realtime_stream_processor.dart';

// Create the processor
final processor = RealtimeStreamProcessor(
  maxQueueSize: 10, // maximum queue size
);

// Initialize the camera
await processor.initializeCamera(
  resolutionPreset: ResolutionPreset.medium,
);

// Start streaming
await processor.startStreaming();

// Receive results
processor.outputStream.listen((result) {
  print('Processed frame ${result.frameId}');
  print('Metadata: ${result.metadata}');
  
  // Use the ML Kit InputImage
  if (result.mlKitInputImage != null) {
    // Process with an ML Kit model
    // await faceDetector.processImage(result.mlKitInputImage!);
  }
});

// Handle errors
processor.errorStream.listen((error) {
  print('Error: $error');
});

// Stop streaming
await processor.stopStreaming();

// Release resources
await processor.dispose();
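In a full Flutter app, the start/stop calls above are usually tied to a widget's lifecycle. A minimal sketch, assuming only the API shown above (`CameraPreviewPage` is a hypothetical widget name, and the `build` body is a placeholder):

```dart
import 'package:flutter/material.dart';
import 'package:realtime_stream_processor/realtime_stream_processor.dart';

// Hypothetical host widget: starts the processor when the widget mounts
// and releases the camera when the widget is removed from the tree.
class CameraPreviewPage extends StatefulWidget {
  const CameraPreviewPage({super.key});

  @override
  State<CameraPreviewPage> createState() => _CameraPreviewPageState();
}

class _CameraPreviewPageState extends State<CameraPreviewPage> {
  final _processor = RealtimeStreamProcessor(maxQueueSize: 10);

  @override
  void initState() {
    super.initState();
    _start();
  }

  Future<void> _start() async {
    await _processor.initializeCamera();
    await _processor.startStreaming();
  }

  @override
  void dispose() {
    // Stop the stream and release camera resources with the widget.
    _processor.stopStreaming();
    _processor.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) => const Placeholder();
}
```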

2. Custom Frame Processing

final processor = RealtimeStreamProcessor(
  maxQueueSize: 10,
  onProcessFrame: (frameData) async {
    // Custom processing logic
    final cameraImage = frameData.cameraImage;
    
    // e.g. face detection, object detection, segmentation, ...
    // final faces = await faceDetector.processImage(inputImage);
    
    return ProcessedFrameResult(
      frameId: frameData.frameId,
      processedAt: DateTime.now(),
      metadata: {
        'width': cameraImage.width,
        'height': cameraImage.height,
        'customData': 'your_data',
      },
    );
  },
);

3. Stream Control

// Pause
processor.pause();

// Resume
processor.resume();

// Stop
await processor.stopStreaming();

// Check statistics
final stats = processor.statistics;
print('FPS: ${stats['fps']}');
print('Processed: ${stats['processedFrames']}');
print('Dropped: ${stats['droppedFrames']}');

๐Ÿ—๏ธ ์•„ํ‚คํ…์ฒ˜

┌─────────────┐
│   Camera    │
└──────┬──────┘
       │ Stream
       ▼
┌─────────────┐
│ Input Queue │ ◄── bounded by maxQueueSize
└──────┬──────┘
       │ Pop
       ▼
┌─────────────┐
│  Process    │ ◄── onProcessFrame callback
│   Frame     │
└──────┬──────┘
       │
       ▼
┌─────────────┐
│ Output      │
│ Stream      │ ──► outputStream.listen()
└─────────────┘
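The overflow behavior of the Input Queue stage can be pictured with a small sketch. This is illustrative only, not the library's actual internals; drop-oldest is one common policy, and the processor may instead drop the newest frame when full:

```dart
import 'dart:collection';

/// Illustrative bounded queue: when full, the oldest frame is dropped
/// so the freshest data is always kept (one possible overflow policy).
class BoundedFrameQueue<T> {
  BoundedFrameQueue(this.maxSize);

  final int maxSize;
  final Queue<T> _queue = Queue<T>();
  int droppedFrames = 0;

  void enqueue(T frame) {
    if (_queue.length >= maxSize) {
      _queue.removeFirst(); // drop the oldest frame
      droppedFrames++;
    }
    _queue.addLast(frame);
  }

  T? dequeue() => _queue.isEmpty ? null : _queue.removeFirst();

  int get length => _queue.length;
}
```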

📊 Class Structure

RealtimeStreamProcessor

The main processor class

Constructor

RealtimeStreamProcessor({
  int maxQueueSize = 10,
  Future<ProcessedFrameResult> Function(FrameData)? onProcessFrame,
})

Key Methods

Method              Description
initializeCamera()  Initialize the camera
startStreaming()    Start streaming
stopStreaming()     Stop streaming
pause()             Pause processing
resume()            Resume processing
dispose()           Release resources
enqueueFrame()      Manually enqueue a frame (for testing)

์ฃผ์š” ์†์„ฑ

์†์„ฑ ํƒ€์ž… ์„ค๋ช…
outputStream Stream<ProcessedFrameResult> ์ฒ˜๋ฆฌ๋œ ํ”„๋ ˆ์ž„ ์ŠคํŠธ๋ฆผ
errorStream Stream<Exception> ์—๋Ÿฌ ์ŠคํŠธ๋ฆผ
queueSize int ํ˜„์žฌ ํ ํฌ๊ธฐ
isProcessing bool ์ฒ˜๋ฆฌ ์ค‘ ์—ฌ๋ถ€
isPaused bool ์ผ์‹œ์ •์ง€ ์ƒํƒœ
statistics Map<String, dynamic> ํ†ต๊ณ„ ์ •๋ณด

FrameData

Input frame data

class FrameData {
  final CameraImage cameraImage;
  final DateTime timestamp;
  final int frameId;
}

ProcessedFrameResult

Processed frame result

class ProcessedFrameResult {
  final int frameId;
  final DateTime processedAt;
  final Uint8List? processedImageBytes;
  final Map<String, dynamic> metadata;
  final InputImage? mlKitInputImage;
}

🎯 Use Cases

1. Real-time Face Detection

import 'package:google_mlkit_face_detection/google_mlkit_face_detection.dart';

final faceDetector = FaceDetector(
  options: FaceDetectorOptions(enableLandmarks: true),
);

final processor = RealtimeStreamProcessor(
  onProcessFrame: (frameData) async {
    final inputImage = _convertToInputImage(frameData.cameraImage);
    final faces = await faceDetector.processImage(inputImage);
    
    return ProcessedFrameResult(
      frameId: frameData.frameId,
      processedAt: DateTime.now(),
      metadata: {
        'faceCount': faces.length,
        'faces': faces.map((f) => f.boundingBox).toList(),
      },
      mlKitInputImage: inputImage,
    );
  },
);

2. Real-time Text Recognition

import 'package:google_mlkit_text_recognition/google_mlkit_text_recognition.dart';

final textRecognizer = TextRecognizer();

final processor = RealtimeStreamProcessor(
  onProcessFrame: (frameData) async {
    final inputImage = _convertToInputImage(frameData.cameraImage);
    final recognizedText = await textRecognizer.processImage(inputImage);
    
    return ProcessedFrameResult(
      frameId: frameData.frameId,
      processedAt: DateTime.now(),
      metadata: {
        'text': recognizedText.text,
        'blocks': recognizedText.blocks.length,
      },
      mlKitInputImage: inputImage,
    );
  },
);

3. Real-time Object Detection

import 'package:google_mlkit_object_detection/google_mlkit_object_detection.dart';

final objectDetector = ObjectDetector(
  options: ObjectDetectorOptions(mode: DetectionMode.stream),
);

final processor = RealtimeStreamProcessor(
  onProcessFrame: (frameData) async {
    final inputImage = _convertToInputImage(frameData.cameraImage);
    final objects = await objectDetector.processImage(inputImage);
    
    return ProcessedFrameResult(
      frameId: frameData.frameId,
      processedAt: DateTime.now(),
      metadata: {
        'objectCount': objects.length,
        'objects': objects.map((o) => o.labels).toList(),
      },
      mlKitInputImage: inputImage,
    );
  },
);
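All three use cases assume a `_convertToInputImage` helper. The library's `mlKitInputImage` field already provides this conversion, but if you need to write your own, the sketch below shows the general shape. The exact `InputImageMetadata` fields and supported formats depend on your `google_mlkit_commons` version and the platform's frame format, so treat the details as assumptions to verify:

```dart
import 'dart:typed_data';
import 'dart:ui' show Size;

import 'package:camera/camera.dart';
import 'package:google_mlkit_commons/google_mlkit_commons.dart';

// Sketch only: straightforward for single-plane frames (NV21 on Android,
// BGRA8888 on iOS); multi-plane YUV420 needs its planes concatenated.
InputImage? _convertToInputImage(
  CameraImage image,
  InputImageRotation rotation, // derive from the camera sensor orientation
) {
  final format = InputImageFormatValue.fromRawValue(image.format.raw);
  if (format == null) return null;

  // Concatenate all plane bytes (a no-op for single-plane formats).
  final bytes = Uint8List.fromList(
    image.planes.expand((plane) => plane.bytes).toList(),
  );

  return InputImage.fromBytes(
    bytes: bytes,
    metadata: InputImageMetadata(
      size: Size(image.width.toDouble(), image.height.toDouble()),
      rotation: rotation,
      format: format,
      bytesPerRow: image.planes.first.bytesPerRow,
    ),
  );
}
```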

⚙️ Configuration Options

Camera Settings

await processor.initializeCamera(
  cameraDescription: cameras[0],  // select a specific camera
  resolutionPreset: ResolutionPreset.high,  // resolution
  enableAudio: false,  // disable audio
);

Queue Size Tuning

// A larger queue buffers more frames,
// but increases memory usage and latency.
final processor = RealtimeStreamProcessor(
  maxQueueSize: 20,  // default: 10
);

📈 Performance Optimization

1. Choose an Appropriate Resolution

// lower resolution = faster processing
ResolutionPreset.low      // 352x288
ResolutionPreset.medium   // 720x480
ResolutionPreset.high     // 1280x720
ResolutionPreset.veryHigh // 1920x1080

2. Tune the Queue Size

  • Small queue (5-10): lower latency, more dropped frames
  • Large queue (20-30): higher latency, fewer dropped frames

3. Monitor Processing Time

// This assumes your onProcessFrame callback stored the frame's capture
// time in the metadata, e.g. metadata: {'timestamp': frameData.timestamp}.
processor.outputStream.listen((result) {
  final processingTime = result.processedAt.difference(
    result.metadata['timestamp'] as DateTime,
  );
  print('Processing time: ${processingTime.inMilliseconds}ms');
});

๐Ÿ› ๋ฌธ์ œ ํ•ด๊ฒฐ

์นด๋ฉ”๋ผ๊ฐ€ ์ดˆ๊ธฐํ™”๋˜์ง€ ์•Š์Œ

// ๊ถŒํ•œ ํ™•์ธ
// AndroidManifest.xml์— ์ถ”๊ฐ€
<uses-permission android:name="android.permission.CAMERA"/>

// iOS Info.plist์— ์ถ”๊ฐ€
<key>NSCameraUsageDescription</key>
<string>์นด๋ฉ”๋ผ ์ ‘๊ทผ์ด ํ•„์š”ํ•ฉ๋‹ˆ๋‹ค</string>

ํ”„๋ ˆ์ž„์ด ๊ณผ๋„ํ•˜๊ฒŒ ๋“œ๋กญ๋จ

// 1. ํ•ด์ƒ๋„ ๋‚ฎ์ถ”๊ธฐ
ResolutionPreset.low

// 2. ํ ํฌ๊ธฐ ์ฆ๊ฐ€
maxQueueSize: 20

// 3. ์ฒ˜๋ฆฌ ๋กœ์ง ์ตœ์ ํ™”
// ๋ฌด๊ฑฐ์šด ์—ฐ์‚ฐ์€ isolate๋กœ ๋ถ„๋ฆฌ
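As one hedged example of the isolate suggestion, Flutter's `compute` helper can move a CPU-heavy step off the main isolate. `_heavyAnalysis` is a hypothetical function; note that plugin calls such as ML Kit inference generally cannot run in a background isolate, so this applies to pure-Dart work like pixel processing:

```dart
import 'dart:typed_data';

import 'package:flutter/foundation.dart';
import 'package:realtime_stream_processor/realtime_stream_processor.dart';

// Hypothetical pure-Dart heavy step (e.g. pixel statistics). It must be a
// top-level or static function to be usable with compute().
Map<String, double> _heavyAnalysis(Uint8List bytes) {
  var sum = 0;
  for (final b in bytes) {
    sum += b;
  }
  return {'meanLuma': bytes.isEmpty ? 0.0 : sum / bytes.length};
}

final processor = RealtimeStreamProcessor(
  onProcessFrame: (frameData) async {
    final bytes = frameData.cameraImage.planes.first.bytes;
    // Runs _heavyAnalysis in a background isolate, keeping the UI responsive.
    final stats = await compute(_heavyAnalysis, bytes);

    return ProcessedFrameResult(
      frameId: frameData.frameId,
      processedAt: DateTime.now(),
      metadata: stats,
    );
  },
);
```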

Out of Memory

// 1. Reduce the queue size
maxQueueSize: 5

// 2. Periodically dispose and recreate the processor
await processor.dispose();
processor = RealtimeStreamProcessor(...);

📄 License

MIT License

👨‍💻 Developer

GenSpark AI Developer

🙏 Acknowledgements

  • Flutter Camera Package
  • Google ML Kit
  • Dart Image Package

About

Real-time camera frame processor with queue-based buffering and Google ML Kit integration. Built with Dart/Flutter for efficient stream processing and computer vision applications.
