iOS: Background Push Voice Announcements and Background Bluetooth Printing After a Push Wake-Up

The company I recently joined does payment aggregation, so its users are mostly merchants. When a customer pays by scanning a QR code, the merchant's app needs to announce the amount received and print a receipt over Bluetooth even while the app is in the background. (The app had existed for two years without either feature, which frankly was a bit absurd.) So I looked into background push voice announcements and background Bluetooth printing triggered by a push wake-up, and these are my notes.

1. App settings

(Screenshot: the background-mode checkboxes to enable in the app target's settings.)

2. Background voice announcements

For the voice announcement I use the notification service extension introduced in iOS 10, UNNotificationServiceExtension. Briefly, to create one: in Xcode choose File -> New -> Target -> Notification Service Extension and click Finish.

Things to note:
1. The extension target's Bundle ID is the main project's Bundle ID with a custom name appended at the end, for example:

Main app Bundle ID: 'com.demaxia.alibaba'
NotificationServiceExtension target Bundle ID: 'com.demaxia.alibaba.pandatv'

2. Don't forget to set the minimum iOS version the NotificationServiceExtension target supports. A newly created target defaults to the latest system version, but this feature requires iOS 10.0 at minimum.
You set this under the target's General -> Deployment Info. (I forgot to set it at first and spent ages hunting for the cause...)

3. Push payload keys

aps = {
    alert = "实收0.01元,订单金额0.01元";
    category = "iOS category";
    "content-available" = 1;
    "mutable-content" = 1;
    sound = "cash_success.m4a";
};

A note on the keys: "mutable-content" = 1 tells the system that we are allowed to modify the incoming notification content inside the NotificationServiceExtension (for example, adding or changing the title). I won't go into detail here since I don't modify the content in this project.
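For completeness: the extension code shown later also reads two custom top-level keys from the payload, type (2 or 3 triggers the merged-file announcement) and data (a JSON string carrying pay_type and sum_amount). Roughly, the userInfo our backend sends looks like this — the custom keys are specific to this project, not standard APNs fields:

{
    aps = { ... };    // same aps block as above
    type = 2;
    data = "{\"pay_type\":1,\"sum_amount\":\"0.01\"}";
}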

"content-available" = 1;这个是后台推送唤醒必须要加的字段,不加无反应。加了之后后台收到推送响应的方法是

-(void)application:(UIApplication *)application didReceiveRemoteNotification:(NSDictionary *)userInfo fetchCompletionHandler:(void (^)(UIBackgroundFetchResult))completionHandler

Another very important point: Background App Refresh must be enabled for the app in the device Settings, otherwise nothing will happen when you unplug the cable and test in the background. Remember this 😭.
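If you want to check this at runtime (for example, to remind the merchant to turn it on), a minimal sketch using the system API might look like this — where you call it (e.g. in the AppDelegate) is up to you:

#import <UIKit/UIKit.h>

// Check whether Background App Refresh is available for this app,
// e.g. from applicationDidBecomeActive:.
- (void)checkBackgroundRefreshStatus {
    UIBackgroundRefreshStatus status = [UIApplication sharedApplication].backgroundRefreshStatus;
    if (status != UIBackgroundRefreshStatusAvailable) {
        // Denied or Restricted: background wake-up pushes will not reach the app,
        // so announcements / printing will only work while it is in the foreground.
        NSLog(@"Background App Refresh is off - remind the user to enable it in Settings.");
    }
}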

The voice announcement here is produced by stitching audio files together; the files were lifted from Alipay's voice assets. If you want them, download PP助手, install Alipay through it to get the .ipa, and extract the files from there — I won't walk through that part. Here is my code; please forgive the rough edges.

#define kFileManager [NSFileManager defaultManager]

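// Lookup table mapping digits, units (ten / hundred / thousand / ten-thousand) and payment channels to the bundled tts_* audio file names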
#define MONEYVOICR_DIC @{@"1":@"tts_1",@"2":@"tts_2",@"3":@"tts_3",@"4":@"tts_4",@"5":@"tts_5",@"6":@"tts_6",@"7":@"tts_7",@"8":@"tts_8",@"9":@"tts_9",@"10":@"tts_ten",@"100":@"tts_hundred",@"1000":@"tts_thousand",@"10000":@"tts_ten_thousand",@"0":@"tts_0",@".":@"tts_dot",@"yuan":@"tts_yuan",@"dingdan":@"middle",@"shishou":@"pre",@"vipcard":@"vipcard",@"wx":@"wx",@"zfb":@"zfb"}

#import "NotificationService.h"
#import <AVFoundation/AVFoundation.h>

typedef void(^PlayVoiceBlock)(void);

@interface NotificationService ()<AVSpeechSynthesizerDelegate, AVAudioPlayerDelegate>
{
    AVSpeechSynthesizer *synthesizer;
}
@property (nonatomic, strong) void (^contentHandler)(UNNotificationContent *contentToDeliver);
@property (nonatomic, strong) UNMutableNotificationContent *bestAttemptContent;

// Callback invoked when AVSpeechSynthesizer finishes speaking
@property (nonatomic, copy) PlayVoiceBlock finshBlock;

// Callback invoked when AVAudioPlayer finishes playing
@property (nonatomic, copy) PlayVoiceBlock audioPlayerfinshBlock;

@property (nonatomic, strong) AVAudioPlayer *player;
@property (nonatomic, strong) NSMutableArray *firlArray;
@property (nonatomic, strong) NSString *filePath;
@end

@implementation NotificationService

- (void)didReceiveNotificationRequest:(UNNotificationRequest *)request withContentHandler:(void (^)(UNNotificationContent * _Nonnull))contentHandler {
    self.contentHandler = contentHandler;
    self.bestAttemptContent = [request.content mutableCopy];

    // Modify the notification content here...
    //self.bestAttemptContent.title = [NSString stringWithFormat:@"%@ [modified]", self.bestAttemptContent.title];

    if ([[self.bestAttemptContent.userInfo objectForKey:@"type"] intValue] == 2 || [[self.bestAttemptContent.userInfo objectForKey:@"type"] intValue] == 3) {

        NSString *strData = [self.bestAttemptContent.userInfo objectForKey:@"data"];
        if (strData.length > 0) {
            NSData *data = [strData dataUsingEncoding:NSUTF8StringEncoding];
            NSDictionary *dictData = [NSJSONSerialization JSONObjectWithData:data options:0 error:nil];
            int payType = [[dictData objectForKey:@"pay_type"] intValue];

            if (payType == 1 || payType == 3) {
                _firlArray = [[NSMutableArray alloc] init];
                [_firlArray addObject:MONEYVOICR_DIC[@"wx"]];
                __weak __typeof(self) weakSelf = self;
                BOOL isPlay = [NSString stringWithFormat:@"%@", [[self.bestAttemptContent.userInfo objectForKey:@"aps"] objectForKey:@"sound"]].length > 0;
                if (isPlay) {
                    [self hechengVoiceWithFinshBlock:[NSString stringWithFormat:@"%@", [dictData objectForKey:@"sum_amount"]] finshBlock:^{
                        weakSelf.contentHandler(weakSelf.bestAttemptContent);
                    }];
                } else {
                    self.contentHandler(self.bestAttemptContent);
                }
            } else if (payType == 2 || payType == 4) {
                _firlArray = [[NSMutableArray alloc] init];
                [_firlArray addObject:MONEYVOICR_DIC[@"zfb"]];
                __weak __typeof(self) weakSelf = self;
                BOOL isPlay = [NSString stringWithFormat:@"%@", [[self.bestAttemptContent.userInfo objectForKey:@"aps"] objectForKey:@"sound"]].length > 0;
                if (isPlay) {
                    [self hechengVoiceWithFinshBlock:[NSString stringWithFormat:@"%@", [dictData objectForKey:@"sum_amount"]] finshBlock:^{
                        weakSelf.contentHandler(weakSelf.bestAttemptContent);
                    }];
                } else {
                    self.contentHandler(self.bestAttemptContent);
                }
            } else if (payType == 5) {
                _firlArray = [[NSMutableArray alloc] init];
                [_firlArray addObject:MONEYVOICR_DIC[@"vipcard"]];
                __weak __typeof(self) weakSelf = self;
                BOOL isPlay = [NSString stringWithFormat:@"%@", [[self.bestAttemptContent.userInfo objectForKey:@"aps"] objectForKey:@"sound"]].length > 0;
                if (isPlay) {
                    [self hechengVoiceWithFinshBlock:[NSString stringWithFormat:@"%@", [dictData objectForKey:@"sum_amount"]] finshBlock:^{
                        weakSelf.contentHandler(weakSelf.bestAttemptContent);
                    }];
                } else {
                    self.contentHandler(self.bestAttemptContent);
                }
            } else {
                // Parse the push's custom userInfo and speak the alert text with TTS
                NSString *alertStr = self.bestAttemptContent.userInfo[@"aps"][@"alert"];
                __weak __typeof(self) weakSelf = self;
                BOOL isPlay = [NSString stringWithFormat:@"%@", [[self.bestAttemptContent.userInfo objectForKey:@"aps"] objectForKey:@"sound"]].length > 0;
                if (isPlay) {
                    [self playVoiceWithAVSpeechSynthesisVoiceWithContent:alertStr fishBlock:^{
                        weakSelf.contentHandler(weakSelf.bestAttemptContent);
                    }];
                } else {
                    self.contentHandler(self.bestAttemptContent);
                }
            }
        }
        NSLog(@"________________%@", self.bestAttemptContent.userInfo);
    } else {
        self.contentHandler(self.bestAttemptContent);
    }
}

- (void)serviceExtensionTimeWillExpire {
    // Called just before the extension will be terminated by the system.
    // Use this as an opportunity to deliver your "best attempt" at modified content, otherwise the original push payload will be used.
    self.contentHandler(self.bestAttemptContent);
}

#pragma mark - Text-to-speech playback with AVSpeechSynthesisVoice
- (void)playVoiceWithAVSpeechSynthesisVoiceWithContent:(NSString *)content fishBlock:(PlayVoiceBlock)finshBlock
{
    if (content.length == 0) {
        return;
    }
    if (finshBlock) {
        self.finshBlock = finshBlock;
    }

    // Create the voice; returns nil if the requested voice does not exist
    AVSpeechSynthesisVoice *voice = [AVSpeechSynthesisVoice voiceWithLanguage:@"zh-CN"];

    // Create the speech synthesizer
    synthesizer = [[AVSpeechSynthesizer alloc] init];
    synthesizer.delegate = self;

    // Create the utterance to speak
    AVSpeechUtterance *utterance = [AVSpeechUtterance speechUtteranceWithString:content];
    utterance.voice = voice;
    utterance.rate = 0.5f;   // speech rate
    utterance.volume = 1.0f;

    // Speak the content
    [synthesizer speakUtterance:utterance];
}

- (void)speechSynthesizer:(AVSpeechSynthesizer *)synthesizer didStartSpeechUtterance:(AVSpeechUtterance *)utterance
{
    NSLog(@"speech started");
}
- (void)speechSynthesizer:(AVSpeechSynthesizer *)synthesizer didFinishSpeechUtterance:(AVSpeechUtterance *)utterance
{
    if (self.finshBlock) {
        self.finshBlock();
    }
    NSLog(@"speech finished");
}
- (void)speechSynthesizer:(AVSpeechSynthesizer *)synthesizer didPauseSpeechUtterance:(AVSpeechUtterance *)utterance
{

}
- (void)speechSynthesizer:(AVSpeechSynthesizer *)synthesizer didContinueSpeechUtterance:(AVSpeechUtterance *)utterance
{

}
- (void)speechSynthesizer:(AVSpeechSynthesizer *)synthesizer didCancelSpeechUtterance:(AVSpeechUtterance *)utterance
{

}

- (NSString *)filePath {
    if (!_filePath) {
        _filePath = [NSSearchPathForDirectoriesInDomains(NSLibraryDirectory, NSUserDomainMask, YES) firstObject];
        NSString *folderName = [_filePath stringByAppendingPathComponent:@"MergeAudio"];
        BOOL isCreateSuccess = [kFileManager createDirectoryAtPath:folderName withIntermediateDirectories:YES attributes:nil error:nil];
        if (isCreateSuccess) _filePath = [folderName stringByAppendingPathComponent:@"compoundvoice.m4a"];
    }
    return _filePath;
}
#pragma mark - Merge the audio clips and play the result
- (void)hechengVoiceWithFinshBlock:(NSString *)shuStr finshBlock:(PlayVoiceBlock)block {
    /************************ Merge the audio clips and play them *****************************/
    NSArray *array = [shuStr componentsSeparatedByString:@"."];
    if (array.count > 1) {
        [self formattingFileName:array];
        [_firlArray addObject:MONEYVOICR_DIC[@"."]];
        NSString *xiaoStr = array[1];
        NSString *temp = nil;
        for (int i = 0; i < xiaoStr.length; i++) {
            temp = [xiaoStr substringWithRange:NSMakeRange(i, 1)];
            [_firlArray addObject:MONEYVOICR_DIC[temp]];
        }
        [_firlArray addObject:MONEYVOICR_DIC[@"yuan"]];
    } else {
        [self formattingFileName:array];
    }

    AVMutableComposition *composition = [AVMutableComposition composition];

    CMTime allTime = kCMTimeZero;

    for (NSInteger i = 0; i < _firlArray.count; i++) {
        NSString *auidoPath = [[NSBundle mainBundle] pathForResource:[NSString stringWithFormat:@"%@", _firlArray[i]] ofType:@"mp3"];
        AVURLAsset *audioAsset = [AVURLAsset assetWithURL:[NSURL fileURLWithPath:auidoPath]];

        // Audio track in the composition
        AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

        CMTimeRange audio_timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration);

        // Source audio track
        AVAssetTrack *audioAssetTrack = [[audioAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];

        // Merge the audio - insert the source track into the composition
        // Parameters:
        //   insertTimeRange: time range of the source audio file
        //   ofTrack: the source track to insert
        //   atTime: where in the destination the source audio is inserted
        //   error: records the error if the insert fails
        // Returns: YES if the insert succeeded, NO otherwise
        BOOL success = [audioTrack insertTimeRange:audio_timeRange ofTrack:audioAssetTrack atTime:allTime error:nil];

        if (!success) {
            NSLog(@"failed to insert audio track");
            return;
        }
        // Advance the insertion point
        allTime = CMTimeAdd(allTime, audioAsset.duration);
    }

    // Export the merged file - `presetName` must match the `session.outputFileType` set below
    AVAssetExportSession *session = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetAppleM4A];
    NSString *outPutFilePath = [[self.filePath stringByDeletingLastPathComponent] stringByAppendingPathComponent:@"compoundvoice.m4a"];

    if ([[NSFileManager defaultManager] fileExistsAtPath:outPutFilePath]) {
        [[NSFileManager defaultManager] removeItemAtPath:outPutFilePath error:nil];
    }

    // (check session.supportedFileTypes here if needed)
    session.outputURL = [NSURL fileURLWithPath:outPutFilePath];
    session.outputFileType = AVFileTypeAppleM4A; // must correspond to the preset above
    session.shouldOptimizeForNetworkUse = YES;

    [session exportAsynchronouslyWithCompletionHandler:^{
        if (session.status == AVAssetExportSessionStatusCompleted) {
            NSLog(@"merge succeeded ---- %@", outPutFilePath);
            NSURL *url = [NSURL fileURLWithPath:outPutFilePath];
            self.player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:nil];
            self.player.numberOfLoops = 0;
            self.player.delegate = self;
            self.player.volume = 1.0;
            [self.player prepareToPlay];
            [self.player play];
            if (block) {
                self.audioPlayerfinshBlock = block;
            }
        } else {
            // Export failed - finish immediately instead of waiting for the extension's time limit
            if (block) {
                block();
            }
        }
    }];
    /************************ Merge the audio clips and play them *****************************/
}
// AVAudioPlayer delegate: called when playback finishes
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag {
    if (self.audioPlayerfinshBlock) {
        self.audioPlayerfinshBlock();
    }
}
// Called when a decode error occurs during playback
- (void)audioPlayerDecodeErrorDidOccur:(AVAudioPlayer *)player error:(NSError * __nullable)error {
    if (self.audioPlayerfinshBlock) {
        self.audioPlayerfinshBlock();
    }
}

- (void)formattingFileName:(NSArray *)array {
    NSString *bigStr = array[0];
    NSString *bigTemp = nil;
    for (int i = 0; i < bigStr.length; i++) {
        bigTemp = [bigStr substringWithRange:NSMakeRange(i, 1)];
        switch (bigStr.length) {
            case 5: {
                if (i == 0) {
                    [_firlArray addObject:MONEYVOICR_DIC[bigTemp]];
                    [_firlArray addObject:MONEYVOICR_DIC[@"10000"]];
                } else if (i == 1) {
                    if ([bigTemp intValue] > 0) {
                        [_firlArray addObject:MONEYVOICR_DIC[bigTemp]];
                        [_firlArray addObject:MONEYVOICR_DIC[@"1000"]];
                    }
                } else if (i == 2) {
                    if ([bigTemp intValue] > 0) {
                        [_firlArray addObject:MONEYVOICR_DIC[bigTemp]];
                        [_firlArray addObject:MONEYVOICR_DIC[@"100"]];
                    }
                } else if (i == 3) {
                    if ([bigTemp intValue] > 0) {
                        [_firlArray addObject:MONEYVOICR_DIC[bigTemp]];
                        [_firlArray addObject:MONEYVOICR_DIC[@"10"]];
                    }
                } else if (i == 4) {
                    if ([bigTemp intValue] > 0) {
                        [_firlArray addObject:MONEYVOICR_DIC[bigTemp]];
                    }
                }
                break;
            }
            case 4: {
                if (i == 0) {
                    [_firlArray addObject:MONEYVOICR_DIC[bigTemp]];
                    [_firlArray addObject:MONEYVOICR_DIC[@"1000"]];
                } else if (i == 1) {
                    if ([bigTemp intValue] > 0) {
                        [_firlArray addObject:MONEYVOICR_DIC[bigTemp]];
                        [_firlArray addObject:MONEYVOICR_DIC[@"100"]];
                    }
                } else if (i == 2) {
                    if ([bigTemp intValue] > 0) {
                        [_firlArray addObject:MONEYVOICR_DIC[bigTemp]];
                        [_firlArray addObject:MONEYVOICR_DIC[@"10"]];
                    }
                } else if (i == 3) {
                    if ([bigTemp intValue] > 0) {
                        [_firlArray addObject:MONEYVOICR_DIC[bigTemp]];
                    }
                }
                break;
            }
            case 3: {
                if (i == 0) {
                    [_firlArray addObject:MONEYVOICR_DIC[bigTemp]];
                    [_firlArray addObject:MONEYVOICR_DIC[@"100"]];
                } else if (i == 1) {
                    if ([bigTemp intValue] > 0) {
                        [_firlArray addObject:MONEYVOICR_DIC[bigTemp]];
                        [_firlArray addObject:MONEYVOICR_DIC[@"10"]];
                    }
                } else if (i == 2) {
                    if ([bigTemp intValue] > 0) {
                        [_firlArray addObject:MONEYVOICR_DIC[bigTemp]];
                    }
                }
                break;
            }
            case 2: {
                if (i == 0) {
                    [_firlArray addObject:MONEYVOICR_DIC[bigTemp]];
                    [_firlArray addObject:MONEYVOICR_DIC[@"10"]];
                } else if (i == 1) {
                    if ([bigTemp intValue] > 0) {
                        [_firlArray addObject:MONEYVOICR_DIC[bigTemp]];
                    }
                }
                break;
            }
            case 1: {
                [_firlArray addObject:MONEYVOICR_DIC[bigTemp]];
                break;
            }
            default:
                break;
        }
    }

    if (array.count <= 1) {
        [_firlArray addObject:MONEYVOICR_DIC[@"yuan"]];
    }
}
@end

One more thing: this voice announcement doesn't just work while the app is in the background — it still plays even after the app has been killed, because the notification service extension runs in its own process. It only works on iOS 10 and above, though, so combine it with other approaches as appropriate.

3. Background Bluetooth printing via push wake-up

I won't cover how to implement Bluetooth printing itself — a quick search will turn up plenty of examples. Here I'll only describe how to trigger the printing while the app is in the background.

It's actually very simple: tick the app settings from step 1, then kick off the Bluetooth printing inside the method below.


// This method is called when a push arrives while the app is in the background
- (void)application:(UIApplication *)application didReceiveRemoteNotification:(NSDictionary *)userInfo fetchCompletionHandler:(void (^)(UIBackgroundFetchResult))completionHandler {

    [self processRemoteNotification:userInfo isDidReceive:NO completion:^{
        [JPUSHService handleRemoteNotification:userInfo];
        completionHandler(UIBackgroundFetchResultNewData);
    }];
}

One point that puzzled me for a long time: the system gives this method only a limited amount of time to run — roughly 10 seconds in my experience. The line completionHandler(UIBackgroundFetchResultNewData); tells the system that we are done and it can reclaim the background execution it granted us, so it must only be called after our work has actually finished.

I originally just called it at the end of the method, which is why my network callbacks never completed — that blocked me for several days 😔.

The fix is to use a block callback: once the network request or other work finishes, invoke the block and call completionHandler(UIBackgroundFetchResultNewData); from there.
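A rough sketch of that pattern, assuming processRemoteNotification:isDidReceive:completion: is this project's own helper and printReceiptWithPushInfo:completion: stands in for whatever your Bluetooth printing entry point is:

// Project-specific helper: do the real work first, then report back through the completion block.
- (void)processRemoteNotification:(NSDictionary *)userInfo
                     isDidReceive:(BOOL)isDidReceive
                       completion:(void (^)(void))completion {
    // Placeholder for the actual Bluetooth receipt printing / network request.
    [self printReceiptWithPushInfo:userInfo completion:^{
        // Signal completion only after the work has finished,
        // so the caller can safely invoke completionHandler(UIBackgroundFetchResultNewData).
        if (completion) {
            completion();
        }
    }];
}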

That's roughly the whole process.