iOS Development: How to Upload Audio from the iPod Music Library to a Server

A recent project needed a feature that reads audio files from the phone's media library, plays them inside the app, and converts them to MP3 for upload to a server.

The first step is getting at the audio in the iPod library. I use MPMediaQuery for this. There is one pitfall: many of the songs you get back have a nil AssetURL, because MPMediaQuery can only return an asset URL for songs the user imported through iTunes; cloud-only or DRM-protected items come back with a nil URL.
The code is as follows:

// 1. Create a media query (reads music files from the iPod library)
MPMediaQuery *everything = [[MPMediaQuery alloc] init];
// 2. Create a filter predicate (the value here is an MPMediaType enum value)
MPMediaPropertyPredicate *albumNamePredicate =
    [MPMediaPropertyPredicate predicateWithValue:[NSNumber numberWithInt:MPMediaTypeMusic]
                                     forProperty:MPMediaItemPropertyMediaType];
// 3. Add the predicate to the query
[everything addFilterPredicate:albumNamePredicate];
// 4. Get the array of items that match the predicate
NSArray *itemsFromGenericQuery = [everything items];
// 5. Iterate over the results
for (MPMediaItem *song in itemsFromGenericQuery) {
    // Song title
    NSString *songTitle = [song valueForProperty:MPMediaItemPropertyTitle];
    // Asset URL (note: this is an NSURL, and it is nil for items without a local asset)
    NSURL *url = [song valueForProperty:MPMediaItemPropertyAssetURL];
    // Artist
    NSString *artist = [song valueForProperty:MPMediaItemPropertyArtist];
}
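
Since many items come back with a nil AssetURL (the pitfall mentioned above), it is worth filtering those out before doing anything else. A minimal sketch, keeping the usable items in a hypothetical playableItems array:

// Keep only items that have a local asset URL; cloud-only or DRM-protected
// tracks return nil here and cannot be exported or uploaded.
NSMutableArray<MPMediaItem *> *playableItems = [NSMutableArray array];
for (MPMediaItem *song in itemsFromGenericQuery) {
    NSURL *assetURL = [song valueForProperty:MPMediaItemPropertyAssetURL];
    if (assetURL != nil) {
        [playableItems addObject:song];
    }
}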

Once you have the items, playback inside the app is the obvious next step, and it is straightforward; AVAudioPlayer will do the job.
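
As a rough sketch of this step, AVPlayer is shown here rather than AVAudioPlayer, since AVPlayer accepts ipod-library:// asset URLs directly (self.player is an assumed strong AVPlayer property so the player stays alive while playing, and song is an MPMediaItem from the query above):

// Requires <AVFoundation/AVFoundation.h>.
NSURL *assetURL = [song valueForProperty:MPMediaItemPropertyAssetURL];
if (assetURL != nil) {
    self.player = [AVPlayer playerWithURL:assetURL];  // self.player: assumed strong property
    [self.player play];
}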

Now for the key part. The project needs to upload this audio to a server, but the AssetURL we get is not something an upload task can use directly; it looks like this: ipod-library://item/item.m4a?id=8461911140233300129***

After some research, I settled on this approach: first export the asset as a .caf file on disk, then convert the .caf to .mp3, which can be uploaded. The CAF-to-MP3 conversion uses the third-party library lame.
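
One setup note: the lame_* calls below assume the lame static library and its header have been added to the project; the exact header path depends on how the library was packaged. Something like:

// Assumed import for the lame encoder; adjust the path to match your project setup.
#import "lame/lame.h"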

The code is as follows:
1. Export to CAF format. With this export method, the file name must end with the .caf extension.

- (void)convertToCAF:(NSString *)songUrl name:(NSString *)songName
{
dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, 0.1 * NSEC_PER_SEC);
dispatch_after(popTime, dispatch_get_main_queue(), ^(void){
    [UsingHUD showInView:self.view text:@"Processing, please wait..."];
});
NSURL *url = [NSURL URLWithString:songUrl];
// Chinese song titles break the file path, so use a timestamp for the file name instead
NSString *fileName      = [NSString stringWithFormat:@"%@.caf",[self getTimestamp]];

AVURLAsset *songAsset = [AVURLAsset URLAssetWithURL:url options:nil];

NSError *assetError = nil;
AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:songAsset
                                                            error:&assetError];
if (assetError) {
    NSLog (@"error: %@", assetError);
    return;
}

AVAssetReaderOutput *assetReaderOutput = [AVAssetReaderAudioMixOutput
                                           assetReaderAudioMixOutputWithAudioTracks:songAsset.tracks
                                           audioSettings: nil];
if (! [assetReader canAddOutput: assetReaderOutput]) {
    NSLog (@"can't add reader output... die!");
    return;
}
[assetReader addOutput: assetReaderOutput];

NSArray *dirs = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectoryPath = [dirs objectAtIndex:0];
NSString *exportPath = [documentsDirectoryPath stringByAppendingPathComponent:fileName];
if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath]) {
    [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];
}
NSURL *exportURL = [NSURL fileURLWithPath:exportPath];
AVAssetWriter *assetWriter = [AVAssetWriter assetWriterWithURL:exportURL
                                                       fileType:AVFileTypeCoreAudioFormat
                                                          error:&assetError];
if (assetError) {
    NSLog (@"error: %@", assetError);
    return;
}
AudioChannelLayout channelLayout;
memset(&channelLayout, 0, sizeof(AudioChannelLayout));
channelLayout.mChannelLayoutTag = kAudioChannelLayoutTag_Stereo;
NSDictionary *outputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
                                [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
                                [NSNumber numberWithInt:2], AVNumberOfChannelsKey,
                                [NSData dataWithBytes:&channelLayout length:sizeof(AudioChannelLayout)], AVChannelLayoutKey,
                                [NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,
                                [NSNumber numberWithBool:NO], AVLinearPCMIsNonInterleaved,
                                [NSNumber numberWithBool:NO],AVLinearPCMIsFloatKey,
                                [NSNumber numberWithBool:NO], AVLinearPCMIsBigEndianKey,
                                nil];
AVAssetWriterInput *assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                                                           outputSettings:outputSettings];
if ([assetWriter canAddInput:assetWriterInput]) {
    [assetWriter addInput:assetWriterInput];
} else {
    NSLog (@"can't add asset writer input... die!");
    return;
}

assetWriterInput.expectsMediaDataInRealTime = NO;

[assetWriter startWriting];
[assetReader startReading];

AVAssetTrack *soundTrack = [songAsset.tracks objectAtIndex:0];
CMTime startTime = CMTimeMake (0, soundTrack.naturalTimeScale);
[assetWriter startSessionAtSourceTime: startTime];

__block UInt64 convertedByteCount = 0;

dispatch_queue_t mediaInputQueue = dispatch_queue_create("mediaInputQueue", NULL);
[assetWriterInput requestMediaDataWhenReadyOnQueue:mediaInputQueue
                                        usingBlock: ^
 {
     while (assetWriterInput.readyForMoreMediaData) {
         CMSampleBufferRef nextBuffer = [assetReaderOutput copyNextSampleBuffer];
         if (nextBuffer) {
             // Append the buffer, then release it (copyNextSampleBuffer returns a +1 reference)
             [assetWriterInput appendSampleBuffer:nextBuffer];
             convertedByteCount += CMSampleBufferGetTotalSampleSize(nextBuffer);
             CFRelease(nextBuffer);
         } else {
             // No more samples: finish the writer and tear down the reader
             [assetWriterInput markAsFinished];
             [assetWriter finishWriting];
             [assetReader cancelReading];
             NSDictionary *outputFileAttributes = [[NSFileManager defaultManager]
                                                   attributesOfItemAtPath:exportPath
                                                   error:nil];
             NSLog(@"done. file size is %llu", [outputFileAttributes fileSize]);
             // Convert the CAF file to MP3
             [self audioCAFtoMP3:exportPath];
             break;
         }
     }
     
 }];
}
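
A call-site sketch, assuming the method above lives in the same view controller and song is an MPMediaItem obtained from the query in step 1 (the name argument is only a label here, since the method derives the actual file name from a timestamp):

NSURL *assetURL = [song valueForProperty:MPMediaItemPropertyAssetURL];
NSString *songTitle = [song valueForProperty:MPMediaItemPropertyTitle];
if (assetURL != nil) {
    [self convertToCAF:assetURL.absoluteString name:songTitle];
}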

2. Convert .caf to .mp3

- (void)audioCAFtoMP3:(NSString *)cafFilePath {

// Build the MP3 path by swapping the .caf extension for .mp3
NSString *mp3FilePath = [[cafFilePath stringByDeletingPathExtension] stringByAppendingPathExtension:@"mp3"];

@try {
    int read, write;
    
    FILE *pcm = fopen([cafFilePath fileSystemRepresentation], "rb");  // source: the CAF file to convert
    FILE *mp3 = fopen([mp3FilePath fileSystemRepresentation], "wb");  // output: the generated MP3 file
    if (pcm == NULL || mp3 == NULL) {
        NSLog(@"audioCAFtoMP3: could not open input or output file");
        return;
    }
    fseek(pcm, 4*1024, SEEK_CUR);  // rough skip of the CAF file header
    
    const int PCM_SIZE = 8192;
    const int MP3_SIZE = 8192;
    short int pcm_buffer[PCM_SIZE*2];
    unsigned char mp3_buffer[MP3_SIZE];
    
    lame_t lame = lame_init();
    lame_set_num_channels(lame, 2);       // the CAF exported above is 2-channel interleaved PCM
    lame_set_in_samplerate(lame, 44100);  // must match AVSampleRateKey used in the export
    lame_set_VBR(lame, vbr_default);
    lame_set_brate(lame, 8);
    lame_set_mode(lame, 3);               // 3 = MONO in the MPEG_mode enum (stereo input is downmixed)
    lame_set_quality(lame, 2);            // 2 = high quality, slower encode
    lame_init_params(lame);
    
    do {
        read = fread(pcm_buffer, 2*sizeof(short int), PCM_SIZE, pcm);
        if (read == 0)
            write = lame_encode_flush(lame, mp3_buffer, MP3_SIZE);
        else
            write = lame_encode_buffer_interleaved(lame, pcm_buffer, read, mp3_buffer, MP3_SIZE);
        
        fwrite(mp3_buffer, write, 1, mp3);
        
    } while (read != 0);
    
    lame_close(lame);
    fclose(mp3);
    fclose(pcm);
}
@catch (NSException *exception) {
    NSLog(@"%@",[exception description]);
}
@finally {
    dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, 0.1 * NSEC_PER_SEC);
    dispatch_after(popTime, dispatch_get_main_queue(), ^(void){
        [UsingHUD hideInView:self.view];
        // Upload to OSS on the main thread
        [self localdispose:mp3FilePath];
    });

}
}
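
localdispose: is the project-specific method that hands the MP3 path to the OSS upload. As a rough, hypothetical stand-in for a plain HTTP upload (the uploadMP3AtPath: name and the https://example.com/upload endpoint are placeholders; a real OSS upload would normally go through the Aliyun OSS SDK):

- (void)uploadMP3AtPath:(NSString *)mp3FilePath
{
    // Hypothetical endpoint; replace with your server's upload URL or an OSS SDK call.
    NSURL *uploadURL = [NSURL URLWithString:@"https://example.com/upload"];
    NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:uploadURL];
    request.HTTPMethod = @"POST";
    [request setValue:@"audio/mpeg" forHTTPHeaderField:@"Content-Type"];

    NSURLSessionUploadTask *task =
        [[NSURLSession sharedSession] uploadTaskWithRequest:request
                                                   fromFile:[NSURL fileURLWithPath:mp3FilePath]
                                          completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
            if (error != nil) {
                NSLog(@"upload failed: %@", error);
            } else {
                NSLog(@"upload finished");
            }
        }];
    [task resume];
}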

And with that, the job is done~
