I'm currently working on a smart-home product; the mobile side is built with APICloud plus a native Android module.
Let me start with why I built this custom screen-feedback / instant sound effect. The stock feedback sounds have been wrapped layer upon layer by each handset vendor, so developers generally can't get at a phone's native screen-feedback tone; what you can usually reach is the notification sound, and even if you could get the native tone, adapting it across devices would be an even bigger headache. There is also some subtlety to the format of the audio file you play: I use WAV here. I remember a blog post (I forget where) saying WAV is the safest choice and avoids compatibility issues, though I haven't tested that claim myself yet.
Audio format comparison:
The formats music enthusiasts commonly run into are FLAC, APE, WAV, MP3, AAC, OGG and WMA.
1. Compression ratio (same source material):
aac > ogg > mp3 (wma) > ape > flac > wav
For MP3 vs. WMA, 192 kbps is the dividing line: above 192 kbps MP3 is better, below it WMA is better.
2. Sound quality:
wav = flac = ape > aac > ogg > mp3 > wma
3. Hardware support:
MP3 players: mp3 > wma > wav > flac > ape, aac, ogg
Phones: mp3 > wma > aac, wav > flac, ogg > ape
4. Overall (balancing quality, file size and bit rate): aac > ogg > flac, ape > mp3 > wav, wma
The idea: since I'm adding this feature inside a custom view, I fire an interface event from the ACTION_MOVE branch of onTouchEvent and play the prompt tone inside the interface callback. That way, every finger slide across the view triggers the callback and, with it, the screen-feedback tone.
Here is the code:
package com.face.zophar.modulenext.utils;

import android.content.Context;
import android.media.AudioManager;
import android.media.SoundPool;

import com.face.zophar.modulenext.R;

/**
 * Plays the custom screen-feedback tone.
 */
public class SoundPlayUtils {

    // One playback stream, routed to the system audio stream; last arg is srcQuality (unused, 0 by convention)
    private static SoundPool mSoundPlayer = new SoundPool(1, AudioManager.STREAM_SYSTEM, 0);
    private static SoundPlayUtils soundPlayUtils;

    // Context used for loading, and the id SoundPool assigns to the pre-loaded sample
    private static Context mContext;
    private static int mCSoundId;

    /**
     * Initialise the helper and pre-load the feedback tone.
     *
     * @param context any context; only used to load the raw resource
     */
    public static SoundPlayUtils init(Context context) {
        if (soundPlayUtils == null) {
            soundPlayUtils = new SoundPlayUtils();
        }
        mContext = context;
        mCSoundId = mSoundPlayer.load(mContext, R.raw.note7_b, 1);
        return soundPlayUtils;
    }

    /**
     * Play the pre-loaded tone: full left/right volume, priority 0, no loop, normal rate.
     */
    public static void play() {
        mSoundPlayer.play(mCSoundId, 1, 1, 0, 0, 1);
    }
}
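The custom-view side that fires the callback isn't shown above. The following is only a rough sketch of how it might look; SlideFeedbackView and OnSlideListener are names I made up for illustration. The view exposes a listener interface, fires it on every ACTION_MOVE inside onTouchEvent, and the hosting code plays the tone from the callback via SoundPlayUtils:

import android.content.Context;
import android.util.AttributeSet;
import android.view.MotionEvent;
import android.view.View;

public class SlideFeedbackView extends View {

    /** Fired while the finger slides across the view. */
    public interface OnSlideListener {
        void onSlide(float x, float y);
    }

    private OnSlideListener mListener;

    public SlideFeedbackView(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    public void setOnSlideListener(OnSlideListener listener) {
        mListener = listener;
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Fire the callback on every move so each slide produces feedback
        if (event.getAction() == MotionEvent.ACTION_MOVE && mListener != null) {
            mListener.onSlide(event.getX(), event.getY());
        }
        // Return true so we keep receiving MOVE events after the initial DOWN
        return true;
    }
}

// Hooking it up, e.g. from the hosting Activity or APICloud module:
// SoundPlayUtils.init(context);
// slideFeedbackView.setOnSlideListener((x, y) -> SoundPlayUtils.play());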
Extension: there are plenty of guitar and piano apps on the market, and they can be implemented perfectly well with SoundPool.
The idea: define six views to stand in for the guitar's six strings, plus a map keyed by view id whose value holds the audio clip, start time, end time and so on (these can be wrapped in a model class). Each time a string is struck, append an entry to a List; when recording stops, simply iterate over that List to get the record-and-playback effect.
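The KeyTone model used in the activity code below isn't included in the post. Here is a minimal sketch of what it might look like, with the field and accessor names inferred from how the code calls it (getButtonId, getStartTime, getEndTime, setEndTime):

// Minimal sketch of the KeyTone model: one recorded note.
public class KeyTone {
    private final int buttonId;   // id of the key/string view that was tapped
    private final long startTime; // when this note was struck (ms)
    private long endTime;         // when the next note was struck, i.e. how long to hold

    public KeyTone(int buttonId, long startTime, long endTime) {
        this.buttonId = buttonId;
        this.startTime = startTime;
        this.endTime = endTime;
    }

    public int getButtonId() { return buttonId; }
    public long getStartTime() { return startTime; }
    public long getEndTime() { return endTime; }
    public void setEndTime(long endTime) { this.endTime = endTime; }
}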
Here is the code:
import android.annotation.TargetApi;
import android.app.Activity;
import android.media.SoundPool;
import android.os.Build;
import android.os.Bundle;
import android.util.Log;
import android.util.SparseIntArray;
import android.widget.Button;

import java.util.ArrayList;
import java.util.Date;
import java.util.List;

public class MainActivity extends Activity {

    // Helpful constants
    private final int NR_OF_SIMULTANEOUS_SOUNDS = 7;
    private final float LEFT_VOLUME = 1.0f;
    private final float RIGHT_VOLUME = 1.0f;
    private final int NO_LOOP = 0;
    private final int PRIORITY = 0;
    private final float NORMAL_PLAY_RATE = 1.0f;
    private final static String TAG = "xylo";

    // Recorded notes: which key was hit plus start/end timestamps
    List<KeyTone> keyTones = new ArrayList<>();
    boolean recordingFlag = false;

    @TargetApi(Build.VERSION_CODES.LOLLIPOP)
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        // Create a SoundPool that can play several notes at once
        final SoundPool mSoundPool = new SoundPool.Builder()
                .setMaxStreams(NR_OF_SIMULTANEOUS_SOUNDS)
                .build();
        Log.d("Xylo", "" + mSoundPool);

        // Load the samples and keep the ids SoundPool assigns to them
        int mCSoundId = mSoundPool.load(getApplicationContext(), R.raw.note1_c, 1);
        int mDSoundId = mSoundPool.load(getApplicationContext(), R.raw.note2_d, 1);
        int mESoundId = mSoundPool.load(getApplicationContext(), R.raw.note3_e, 1);
        int mFSoundId = mSoundPool.load(getApplicationContext(), R.raw.note4_f, 1);
        int mGSoundId = mSoundPool.load(getApplicationContext(), R.raw.note5_g, 1);
        int mASoundId = mSoundPool.load(getApplicationContext(), R.raw.note6_a, 1);
        int mBSoundId = mSoundPool.load(getApplicationContext(), R.raw.note7_b, 1);

        // Map each key view to its sound id
        final SparseIntArray keyNoteMap = new SparseIntArray();
        keyNoteMap.put(R.id.c_key, mCSoundId);
        keyNoteMap.put(R.id.d_key, mDSoundId);
        keyNoteMap.put(R.id.e_key, mESoundId);
        keyNoteMap.put(R.id.f_key, mFSoundId);
        keyNoteMap.put(R.id.g_key, mGSoundId);
        keyNoteMap.put(R.id.a_key, mASoundId);
        keyNoteMap.put(R.id.b_key, mBSoundId);

        for (int i = 0; i < keyNoteMap.size(); i++) {
            int buttonId = keyNoteMap.keyAt(i);
            int soundId = keyNoteMap.get(buttonId);
            findViewById(buttonId).setOnClickListener(view -> {
                if (recordingFlag) {
                    Log.d("Xylophone", "Recording Flag on");
                    Log.d("Xylophone", "Adding Id " + view.getId());
                    // Record this key press; its end time is fixed up when the next key is hit
                    long currentTime = new Date().getTime();
                    keyTones.add(new KeyTone(buttonId, currentTime, -100000L));
                    int keyTonesSize = keyTones.size();
                    if (keyTonesSize >= 2) {
                        keyTones.get(keyTonesSize - 2).setEndTime(currentTime);
                    }
                }
                mSoundPool.play(soundId, LEFT_VOLUME, RIGHT_VOLUME, PRIORITY, NO_LOOP, NORMAL_PLAY_RATE);
            });
        }

        Button playButton = (Button) findViewById(R.id.button_play);
        Button recordStopButton = (Button) findViewById(R.id.button_record_stop);

        playButton.setOnClickListener(v -> {
            Log.d(TAG, "Playbutton clicked ");
            // Replay the recording by re-clicking each key and waiting out the gap to the next note.
            // Note: Thread.sleep here blocks the UI thread for the whole playback.
            for (int i = 0; i < keyTones.size(); i++) {
                KeyTone currentTone = keyTones.get(i);
                Button button = (Button) findViewById(currentTone.getButtonId());
                button.performClick();
                long endTime = currentTone.getEndTime();
                long startTime = currentTone.getStartTime();
                if (endTime > 0) {
                    try {
                        Thread.sleep(endTime - startTime);
                    } catch (InterruptedException e) {
                        e.printStackTrace();
                    }
                }
            }
        });

        recordStopButton.setOnClickListener(v -> {
            Log.d(TAG, "record stop clicked ");
            if (recordingFlag) {
                // Stop recording
                recordStopButton.setText(R.string.record);
                recordStopButton.setBackgroundColor(getResources().getColor(R.color.green));
                recordingFlag = false;
                Log.d(TAG, "Keytones: " + keyTones);
            } else {
                // Start a new recording
                recordStopButton.setText(R.string.stop);
                recordStopButton.setBackgroundColor(getResources().getColor(R.color.red));
                recordingFlag = true;
                keyTones = new ArrayList<>();
            }
        });
    }
}
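One design note on the Play handler above: Thread.sleep on the UI thread freezes the interface while the recording replays. A variation (my own sketch, not from the original post) is to schedule each note with a Handler instead:

// Meant to live inside MainActivity; needs android.os.Handler and android.os.Looper
// in addition to the imports above. playRecording is a made-up helper name.
private void playRecording(List<KeyTone> tones, SparseIntArray keyNoteMap, SoundPool soundPool) {
    Handler handler = new Handler(Looper.getMainLooper());
    long delay = 0;
    for (KeyTone tone : tones) {
        int soundId = keyNoteMap.get(tone.getButtonId());
        // Play this note after the accumulated delay, without blocking the UI thread
        handler.postDelayed(() -> soundPool.play(soundId, 1f, 1f, 0, 0, 1f), delay);
        // The gap until the next note is endTime - startTime (endTime <= 0 marks the last note)
        if (tone.getEndTime() > 0) {
            delay += tone.getEndTime() - tone.getStartTime();
        }
    }
}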