AVFoundation-09 Media Reading and Writing

Overview

AVFoundation is a framework for working with and creating time-based audiovisual media. It was built with today's hardware and applications in mind, and its design leans heavily on multithreading: it takes full advantage of multicore hardware, makes extensive use of blocks and GCD, and moves expensive processing onto background threads. It automatically provides hardware-accelerated operations, ensuring that applications run at their best on most devices. The framework is also designed for 64-bit processors and can exploit everything they offer.

[Figure: the iOS media environment]

AVAssetReader

AVAssetReader reads media samples from an AVAsset. You typically configure it with one or more AVAssetReaderOutput instances and pull audio samples and video frames through the - (CMSampleBufferRef)copyNextSampleBuffer; method (a minimal read loop follows the status enum below). The states an AVAssetReader can be in while reading media data are:

typedef NS_ENUM(NSInteger, AVAssetReaderStatus) {
    AVAssetReaderStatusUnknown = 0,
    AVAssetReaderStatusReading,
    AVAssetReaderStatusCompleted,
    AVAssetReaderStatusFailed,
    AVAssetReaderStatusCancelled,
};
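To make the flow concrete, here is a minimal read-loop sketch. It assumes asset and trackOutput have already been configured; the full QMMediaReader implementation later in this article shows how.

NSError *error = nil;
AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:&error];
[reader addOutput:trackOutput];

if (![reader startReading]) {
    // A NO return moves the reader to AVAssetReaderStatusFailed
    NSLog(@"startReading failed: %@", reader.error);
} else {
    CMSampleBufferRef sample = NULL;
    while ((sample = [trackOutput copyNextSampleBuffer])) {
        // ... consume the sample ...
        CFRelease(sample); // copyNextSampleBuffer follows the Create rule
    }
    // copyNextSampleBuffer returns NULL once the reader completes, fails or is cancelled
    if (reader.status == AVAssetReaderStatusFailed) {
        NSLog(@"Reading failed: %@", reader.error);
    }
}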

AVAssetReaderOutput

AVAssetReaderOutput is an abstract class, but the framework defines a number of concrete subclasses for reading decoded media samples from a specific AVAssetTrack, for reading mixed output from multiple audio tracks, or for reading composited output from multiple video tracks (see the sketch below).

[Figure: the AVAssetReaderOutput family]
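The sample project below reads single tracks with AVAssetReaderTrackOutput, but the two multi-track outputs just mentioned are worth a quick sketch, assuming asset has already loaded its tracks:

// Mixed output from all audio tracks of the asset
NSArray *audioTracks = [asset tracksWithMediaType:AVMediaTypeAudio];
AVAssetReaderAudioMixOutput *mixOutput =
    [AVAssetReaderAudioMixOutput assetReaderAudioMixOutputWithAudioTracks:audioTracks
                                                            audioSettings:nil];

// Composited output from all video tracks of the asset
NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
AVAssetReaderVideoCompositionOutput *compositionOutput =
    [AVAssetReaderVideoCompositionOutput assetReaderVideoCompositionOutputWithVideoTracks:videoTracks
                                                                            videoSettings:nil];
compositionOutput.videoComposition = [AVVideoComposition videoCompositionWithPropertiesOfAsset:asset];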

AVAssetWriter

AVAssetWriter is the sibling class of AVAssetReader. Its job is to encode media and write it into a container file such as an MPEG-4 or QuickTime movie. AVAssetWriter automatically supports interleaving media samples; to maintain the desired interleaving, AVAssetWriterInput exposes a readyForMoreMediaData property that indicates whether the input can accept more data while preserving that interleaving (a sketch of this pull model follows the status enum below). The states an AVAssetWriter can be in while writing media data are:

typedef NS_ENUM(NSInteger, AVAssetWriterStatus) {
    AVAssetWriterStatusUnknown = 0,
    AVAssetWriterStatusWriting,
    AVAssetWriterStatusCompleted,
    AVAssetWriterStatusFailed,
    AVAssetWriterStatusCancelled
};
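The QMMediaWriter below polls readyForMoreMediaData in a loop, but for offline writing the framework also offers a pull model that respects the property automatically. A minimal sketch, assuming writerInput is attached to a writer that has already started a session, and nextSampleBuffer() is a hypothetical function supplying samples:

dispatch_queue_t inputQueue = dispatch_queue_create("com.example.writerInput", NULL);
[writerInput requestMediaDataWhenReadyOnQueue:inputQueue usingBlock:^{
    // The block is re-invoked whenever the input can accept more data,
    // which lets the writer keep audio and video samples interleaved.
    while (writerInput.isReadyForMoreMediaData) {
        CMSampleBufferRef sample = nextSampleBuffer(); // hypothetical provider
        if (!sample) {
            [writerInput markAsFinished];
            break;
        }
        [writerInput appendSampleBuffer:sample];
        CFRelease(sample);
    }
}];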

AVAssetWriterInput

AVAssetWriterInput configures the kind of media, such as audio or video, that an AVAssetWriter can handle, and the samples appended to an input ultimately produce a single AVAssetTrack. When appending video samples we usually go through an adaptor object, AVAssetWriterInputPixelBufferAdaptor, because it offers the best performance for video.
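The adaptor's performance edge comes from its pixel buffer pool: rendering into a recycled buffer avoids allocating a new one per frame. A sketch, assuming adaptor is attached to a session that has already started writing (its pixelBufferPool is nil before then) and frameTime is the frame's presentation time:

CVPixelBufferRef pixelBuffer = NULL;
CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                   adaptor.pixelBufferPool,
                                   &pixelBuffer);
if (pixelBuffer) {
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    // ... render the frame into CVPixelBufferGetBaseAddress(pixelBuffer) ...
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    [adaptor appendPixelBuffer:pixelBuffer withPresentationTime:frameTime];
    CVPixelBufferRelease(pixelBuffer);
}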

Reading an Asset

1. Create a QMMediaReader class for reading media files;
2. In - (void)startProcessing, call - (void)loadValuesAsynchronouslyForKeys:(NSArray<NSString *> *)keys completionHandler:(nullable void (^)(void))handler; to load the tracks property asynchronously;
3. Once loading succeeds, create an AVAssetReader;
4. Create the AVAssetReaderTrackOutputs, setting the decoded formats to kCVPixelFormatType_32BGRA for video and kAudioFormatLinearPCM for audio;
5. While the AVAssetReader's status is AVAssetReaderStatusReading, read media data in a loop and pass each CMSampleBufferRef to the consumer via callback;
6. When the AVAssetReader finishes reading, cancel reading and notify the consumer.

//
//  QMMediaReader.h
//  AVFoundation
//
//  Created by mac on 17/8/28.
//  Copyright © 2017年 Qinmin. All rights reserved.
//

#import <AVFoundation/AVFoundation.h>

@interface QMMediaReader : NSObject
@property (nonatomic, copy) void(^videoReaderCallback)(CMSampleBufferRef videoBuffer);
@property (nonatomic, copy) void(^audioReaderCallback)(CMSampleBufferRef audioBuffer);
@property (nonatomic, copy) void(^readerCompleteCallback)(void);

- (instancetype)initWithAsset:(AVAsset *)asset;
- (instancetype)initWithURL:(NSURL *)url;

- (void)startProcessing;
- (void)cancelProcessing;
@end

//
//  QMMediaReader.m
//  AVFoundation
//
//  Created by mac on 17/8/28.
//  Copyright © 2017年 Qinmin. All rights reserved.
//

#import "QMMediaReader.h"
#import <CoreVideo/CoreVideo.h>
#import <AVFoundation/AVFoundation.h>

@interface QMMediaReader ()
@property (nonatomic, strong) NSURL *url;
@property (nonatomic, strong) AVAsset *asset;
@property (nonatomic, strong) AVAssetReader *reader;
@property (nonatomic, assign) CMTime previousFrameTime;
@property (nonatomic, assign) CFAbsoluteTime previousActualFrameTime;
@end

@implementation QMMediaReader

- (instancetype)initWithURL:(NSURL *)url
{
    if (self = [super init])
    {
        self.url = url;
        self.asset = nil;
    }
    
    return self;
}

- (instancetype)initWithAsset:(AVAsset *)asset
{
    if (self = [super init])
    {
        self.url = nil;
        self.asset = asset;
    }
    
    return self;
}

#pragma mark - cancelProcessing
- (void)cancelProcessing
{
    if (self.reader) {
        [self.reader cancelReading];
    }
}

#pragma mark - startProcessing
- (void)startProcessing
{
    _previousFrameTime = kCMTimeZero;
    _previousActualFrameTime = CFAbsoluteTimeGetCurrent();
    
    // Only build the asset from the URL when one wasn't supplied via initWithAsset:
    if (!self.asset) {
        NSDictionary *inputOptions = @{AVURLAssetPreferPreciseDurationAndTimingKey : @(YES)};
        self.asset = [[AVURLAsset alloc] initWithURL:self.url options:inputOptions];
    }
    
    __weak typeof(self) weakSelf = self;
    [self.asset loadValuesAsynchronouslyForKeys:@[@"tracks"] completionHandler: ^{
        dispatch_async(dispatch_get_global_queue(0, 0), ^{
            NSError *error = nil;
            AVKeyValueStatus tracksStatus = [weakSelf.asset statusOfValueForKey:@"tracks" error:&error];
            if (tracksStatus != AVKeyValueStatusLoaded) {
                NSLog(@"Failed to load tracks: %@", error);
                return;
            }
            [weakSelf processAsset];
        });
    }];
}

- (AVAssetReader *)createAssetReader
{
    NSError *error = nil;
    AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:self.asset error:&error];
    
    // Video
    NSArray *videoTracks = [self.asset tracksWithMediaType:AVMediaTypeVideo];
    BOOL shouldRecordVideoTrack = [videoTracks count] > 0;
    AVAssetReaderTrackOutput *readerVideoTrackOutput = nil;
    if (shouldRecordVideoTrack) {
        AVAssetTrack* videoTrack = [videoTracks firstObject];
        NSDictionary *outputSettings = @{
                                         (id)kCVPixelBufferPixelFormatTypeKey:@(kCVPixelFormatType_32BGRA)
                                         };
        readerVideoTrackOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:outputSettings];
        readerVideoTrackOutput.alwaysCopiesSampleData = NO;
        [assetReader addOutput:readerVideoTrackOutput];
    }
    
    // Audio
    NSArray *audioTracks = [self.asset tracksWithMediaType:AVMediaTypeAudio];
    BOOL shouldRecordAudioTrack = [audioTracks count] > 0;
    AVAssetReaderTrackOutput *readerAudioTrackOutput = nil;
    
    if (shouldRecordAudioTrack)
    {
        AVAssetTrack* audioTrack = [audioTracks firstObject];
        NSDictionary *audioOutputSetting = @{
                                             AVFormatIDKey : @(kAudioFormatLinearPCM),
                                             AVNumberOfChannelsKey : @(2),
                                             };

        readerAudioTrackOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack outputSettings:audioOutputSetting];
        readerAudioTrackOutput.alwaysCopiesSampleData = NO;
        [assetReader addOutput:readerAudioTrackOutput];
    }
    
    return assetReader;
}

- (void)processAsset
{
    self.reader = [self createAssetReader];
    
    AVAssetReaderOutput *readerVideoTrackOutput = nil;
    AVAssetReaderOutput *readerAudioTrackOutput = nil;
    
    for (AVAssetReaderOutput *output in self.reader.outputs) {
        if ([output.mediaType isEqualToString:AVMediaTypeAudio]) {
            readerAudioTrackOutput = output;
        } else if ([output.mediaType isEqualToString:AVMediaTypeVideo]) {
            readerVideoTrackOutput = output;
        }
    }
    
    if ([self.reader startReading] == NO) {
        NSLog(@"Error reading from file at URL: %@", self.url);
        return;
    }
    
    while (self.reader.status == AVAssetReaderStatusReading ) {
        if (readerVideoTrackOutput) {
            [self readNextVideoFrameFromOutput:readerVideoTrackOutput];
        }
        
        if (readerAudioTrackOutput) {
            [self readNextAudioSampleFromOutput:readerAudioTrackOutput];
        }
    }
    
    if (self.reader.status == AVAssetReaderStatusCompleted) {
        [self.reader cancelReading];
        if (self.readerCompleteCallback) {
            self.readerCompleteCallback();
        }
    }
    
}

- (void)readNextVideoFrameFromOutput:(AVAssetReaderOutput *)readerVideoTrackOutput;
{
    if (self.reader.status == AVAssetReaderStatusReading)
    {
        CMSampleBufferRef sampleBufferRef = [readerVideoTrackOutput copyNextSampleBuffer];
        if (sampleBufferRef)
        {
            //NSLog(@"read a video frame: %@", CFBridgingRelease(CMTimeCopyDescription(kCFAllocatorDefault, CMSampleBufferGetOutputPresentationTimeStamp(sampleBufferRef))));
            
            BOOL playAtActualSpeed = YES;
            if (playAtActualSpeed) {
                // Do this outside of the video processing queue to not slow that down while waiting
                CMTime currentSampleTime = CMSampleBufferGetOutputPresentationTimeStamp(sampleBufferRef);
                CMTime differenceFromLastFrame = CMTimeSubtract(currentSampleTime, _previousFrameTime);
                CFAbsoluteTime currentActualTime = CFAbsoluteTimeGetCurrent();
                
                CGFloat frameTimeDifference = CMTimeGetSeconds(differenceFromLastFrame);
                CGFloat actualTimeDifference = currentActualTime - _previousActualFrameTime;
                
                if (frameTimeDifference > actualTimeDifference)
                {
                    usleep(1000000.0 * (frameTimeDifference - actualTimeDifference));
                }
                
                _previousFrameTime = currentSampleTime;
                _previousActualFrameTime = CFAbsoluteTimeGetCurrent();
            }
            
            if (self.videoReaderCallback) {
                self.videoReaderCallback(sampleBufferRef);
            }
            CMSampleBufferInvalidate(sampleBufferRef);
            CFRelease(sampleBufferRef);
        }
    }
}

- (void)readNextAudioSampleFromOutput:(AVAssetReaderOutput *)readerAudioTrackOutput;
{
    if (self.reader.status == AVAssetReaderStatusReading)
    {
        CMSampleBufferRef audioSampleBufferRef = [readerAudioTrackOutput copyNextSampleBuffer];
        if (audioSampleBufferRef)
        {
            //NSLog(@"read an audio frame: %@", CFBridgingRelease(CMTimeCopyDescription(kCFAllocatorDefault, CMSampleBufferGetOutputPresentationTimeStamp(audioSampleBufferRef))));
            if (self.audioReaderCallback) {
                self.audioReaderCallback(audioSampleBufferRef);
            }
            CFRelease(audioSampleBufferRef);
        }
    }
}

@end

Writing an Asset

1. Create a QMMediaWriter class for saving media files;
2. Create an AVAssetWriter from the given output path and file type;
3. Create the audio and video AVAssetWriterInputs, setting the encoding formats to AVVideoCodecH264 for video and kAudioFormatMPEG4AAC for audio;
4. While the AVAssetWriter's status is AVAssetWriterStatusWriting, append the audio and video data; whenever an AVAssetWriterInput is not readyForMoreMediaData, wait;
5. When done, call AVAssetWriter's - (void)finishWritingWithCompletionHandler:.

//
//  QMMediaWriter.h
//  AVFoundation
//
//  Created by mac on 17/8/28.
//  Copyright © 2017年 Qinmin. All rights reserved.
//

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface QMMediaWriter : NSObject
- (instancetype)initWithOutputURL:(NSURL *)URL size:(CGSize)newSize;
- (instancetype)initWithOutputURL:(NSURL *)URL size:(CGSize)newSize fileType:(NSString *)newFileType;

- (void)processVideoBuffer:(CMSampleBufferRef)videoBuffer;
- (void)processAudioBuffer:(CMSampleBufferRef)audioBuffer;

- (void)finishWriting;
@end
//
//  QMMediaWriter.m
//  AVFoundation
//
//  Created by mac on 17/8/28.
//  Copyright © 2017年 Qinmin. All rights reserved.
//

#import "QMMediaWriter.h"
#import <CoreVideo/CoreVideo.h>

@interface QMMediaWriter ()
@property (nonatomic, strong) AVAssetWriter *assetWriter;
@property (nonatomic, strong) AVAssetWriterInput *assetWriterAudioInput;
@property (nonatomic, strong) AVAssetWriterInput *assetWriterVideoInput;
@property (nonatomic, strong) AVAssetWriterInputPixelBufferAdaptor *assetWriterPixelBufferInput;
@property (nonatomic, assign) BOOL encodingLiveVideo;
@property (nonatomic, assign) CGSize videoSize;
@property (nonatomic, assign) CMTime startTime;
@end

@implementation QMMediaWriter

- (instancetype)initWithOutputURL:(NSURL *)URL size:(CGSize)newSize
{
    return [self initWithOutputURL:URL size:newSize fileType:AVFileTypeQuickTimeMovie];
}

- (instancetype)initWithOutputURL:(NSURL *)URL size:(CGSize)newSize fileType:(NSString *)newFileType
{
    if (self = [super init]) {
        _videoSize = newSize;
        _startTime = kCMTimeInvalid;
        _encodingLiveVideo = YES;
        
        [self buildAssetWriterWithURL:URL fileType:newFileType];
        [self buildVideoWriter];
        [self buildAudioWriter];
    }
    return self;
}

#pragma mark - AVAssetWriter
- (void)buildAssetWriterWithURL:(NSURL *)url fileType:(NSString *)fileType
{
    NSError *error = nil;
    self.assetWriter = [AVAssetWriter assetWriterWithURL:url fileType:fileType error:&error];
    if (!self.assetWriter) {
        NSLog(@"Failed to create AVAssetWriter: %@", [error localizedDescription]);
        return;
    }
    self.assetWriter.movieFragmentInterval = CMTimeMakeWithSeconds(1.0, 1000);
}

- (void)buildVideoWriter
{
    NSDictionary *dict = @{
                           AVVideoWidthKey:@(_videoSize.width),
                           AVVideoHeightKey:@(_videoSize.height),
                           AVVideoCodecKey:AVVideoCodecH264
                           };
    self.assetWriterVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:dict];
    self.assetWriterVideoInput.expectsMediaDataInRealTime = _encodingLiveVideo;
    
    NSDictionary *attributesDictionary = @{
                                           (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA),
                                           (id)kCVPixelBufferWidthKey : @(_videoSize.width),
                                           (id)kCVPixelBufferHeightKey : @(_videoSize.height)
                                           };
    
    self.assetWriterPixelBufferInput = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:self.assetWriterVideoInput sourcePixelBufferAttributes:attributesDictionary];
    
    [self.assetWriter addInput:self.assetWriterVideoInput];
}

- (void)buildAudioWriter
{
    NSDictionary *audioOutputSettings = @{
                                          AVFormatIDKey : @(kAudioFormatMPEG4AAC),
                                          AVNumberOfChannelsKey : @(2),
                                          AVSampleRateKey : @(48000),
                                          };
    
    self.assetWriterAudioInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:audioOutputSettings];
    
    [self.assetWriter addInput:self.assetWriterAudioInput];
    self.assetWriterAudioInput.expectsMediaDataInRealTime = _encodingLiveVideo;
}

#pragma mark - Process Buffers
- (void)processVideoBuffer:(CMSampleBufferRef)videoBuffer
{
    if (!CMSampleBufferIsValid(videoBuffer)) {
        return;
    }
    
    CFRetain(videoBuffer);
    CMTime currentSampleTime = CMSampleBufferGetOutputPresentationTimeStamp(videoBuffer);
    
    if (CMTIME_IS_INVALID(_startTime))
    {
        if (self.assetWriter.status != AVAssetWriterStatusWriting)
        {
            [self.assetWriter startWriting];
        }
        
        [self.assetWriter startSessionAtSourceTime:currentSampleTime];
        _startTime = currentSampleTime;
    }

    while(!self.assetWriterVideoInput.readyForMoreMediaData && !_encodingLiveVideo) {
        NSDate *maxDate = [NSDate dateWithTimeIntervalSinceNow:0.1];
        [[NSRunLoop currentRunLoop] runUntilDate:maxDate];
    }
    
    NSLog(@"video => %ld %@", (long)self.assetWriter.status, [self.assetWriter.error localizedDescription]);
    
    if (!self.assetWriterVideoInput.readyForMoreMediaData) {
        NSLog(@"had to drop a video frame");
        
    } else if(self.assetWriter.status == AVAssetWriterStatusWriting) {
        CVImageBufferRef cvimgRef = CMSampleBufferGetImageBuffer(videoBuffer);
        if (![self.assetWriterPixelBufferInput appendPixelBuffer:cvimgRef withPresentationTime:currentSampleTime]) {
            NSLog(@"appending pixel fail");
        }
    } else {
        NSLog(@"write frame fail");
    }
    
    CFRelease(videoBuffer);
}

- (void)processAudioBuffer:(CMSampleBufferRef)audioBuffer;
{
    if (!CMSampleBufferIsValid(audioBuffer)) {
        return;
    }
    
    CFRetain(audioBuffer);
    CMTime currentSampleTime = CMSampleBufferGetOutputPresentationTimeStamp(audioBuffer);
    
    if (CMTIME_IS_INVALID(_startTime))
    {
        if (self.assetWriter.status != AVAssetWriterStatusWriting)
        {
            [self.assetWriter startWriting];
        }
        
        [self.assetWriter startSessionAtSourceTime:currentSampleTime];
        _startTime = currentSampleTime;
    }
    
    NSLog(@"audio => %ld %@", (long)self.assetWriter.status, [self.assetWriter.error localizedDescription]);
    
    while(!self.assetWriterAudioInput.readyForMoreMediaData && ! _encodingLiveVideo) {
        NSDate *maxDate = [NSDate dateWithTimeIntervalSinceNow:0.5];
        [[NSRunLoop currentRunLoop] runUntilDate:maxDate];
    }
    
    if (!self.assetWriterAudioInput.readyForMoreMediaData) {
        NSLog(@"had to drop an audio frame");
    } else if(self.assetWriter.status == AVAssetWriterStatusWriting) {
        if (![self.assetWriterAudioInput appendSampleBuffer:audioBuffer]) {
           NSLog(@"appending audio buffer fail");
        }
    } else {
        NSLog(@"write audio frame fail");
    }
    
    CFRelease(audioBuffer);
}

- (void)finishWriting
{
    if (self.assetWriter.status == AVAssetWriterStatusCompleted || self.assetWriter.status == AVAssetWriterStatusCancelled || self.assetWriter.status == AVAssetWriterStatusUnknown) {
        return;
    }
    
    if (self.assetWriter.status == AVAssetWriterStatusWriting) {
        [self.assetWriterVideoInput markAsFinished];
        [self.assetWriterAudioInput markAsFinished];
    }
    
    [self.assetWriter finishWritingWithCompletionHandler:^{
        
    }];
    
}
@end

Using QMMediaReader and QMMediaWriter

_mediaWriter = [[QMMediaWriter alloc] initWithOutputURL:[NSURL fileURLWithPath:kDocumentPath(@"1.mp4")] size:CGSizeMake(640, 360)];
_mediaReader = [[QMMediaReader alloc] initWithURL:[[NSBundle mainBundle] URLForResource:@"1" withExtension:@"mp4"]];

__weak typeof(self) weakSelf = self;
[_mediaReader setVideoReaderCallback:^(CMSampleBufferRef videoBuffer) {
    [weakSelf.mediaWriter processVideoBuffer:videoBuffer];
}];

[_mediaReader setAudioReaderCallback:^(CMSampleBufferRef audioBuffer) {
    [weakSelf.mediaWriter processAudioBuffer:audioBuffer];
}];

[_mediaReader setReaderCompleteCallback:^{
    NSLog(@"==finish===");
    [weakSelf.mediaWriter finishWriting];
}];

[_mediaReader startProcessing];

References

AVFoundation开发秘籍:实践掌握iOS & OSX应用的视听处理技术 (the Chinese edition of Learning AV Foundation)

Source code: https://github.com/QinminiOS/AVFoundation
