This article answers the question of why TextRenderer.MeasureText and Graphics.MeasureString return mismatched sizes. Along the way it also covers related material: where in the Android camera code SurfaceTexture acts as the consumer of the preview stream; why getMeasuredHeight() and getMeasuredWidth() return 0 after View.measure(); Android drawing: .measureText() vs .getTextBounds(); and example source code for android.graphics.SurfaceTexture. We hope it helps.
Contents:

- TextRenderer.MeasureText and Graphics.MeasureString size mismatch
- Android camera SurfaceTexture: where in the code does SurfaceTexture act as the consumer of the preview stream?
- android – getMeasuredHeight() and getMeasuredWidth() return 0 after View.measure()
- Android drawing: .measureText() vs .getTextBounds()
- Example source code for android.graphics.SurfaceTexture
TextRenderer.MeasureText and Graphics.MeasureString size mismatch
This is not a rounding problem; the difference is ~5+ pixels.

Test string: "MACD (26,12,9) -0.000016"

    e.Graphics.MeasureString("MACD (26,12,9) -0.000016", SystemFonts.DefaultFont).Width
    TextRenderer.MeasureText("MACD (26,12,9) -0.000016", SystemFonts.DefaultFont).Width

The results are consistently:

    139.3942 (Graphics.MeasureString)
    134      (TextRenderer.MeasureText)

Why is there such a large difference in size? All I need is the rounded string width for an external painting method, but it should still agree with MeasureString, and vice versa.
Answer 1

TextRenderer renders text with GDI, while Graphics uses GDI+. The two use slightly different methods to lay out text, so the measured sizes differ.

Which one you should use depends on what will ultimately be used to draw the text. If you draw with GDI+ via Graphics.DrawString, measure with Graphics.MeasureString. If you draw with GDI via TextRenderer.DrawText, measure with TextRenderer.MeasureText.

If the text will be displayed inside a Windows Forms control, it is drawn with TextRenderer if UseCompatibleTextRendering is set to false (the default).

Reading between the lines of your question, it sounds like you are using TextRenderer because you have no Graphics instance outside of a Paint event. If so, you can create one yourself for measuring:

    using (Graphics g = someControl.CreateGraphics())
    {
        SizeF size = g.MeasureString("some text", SystemFonts.DefaultFont);
    }

If you don't have access to a control from which to create a Graphics instance, you can create one for the screen, which works fine for measurement purposes:

    using (Graphics g = Graphics.FromHwnd(IntPtr.Zero))
    {
        SizeF size = g.MeasureString("some text", SystemFonts.DefaultFont);
    }
Android camera SurfaceTexture: where in the code does SurfaceTexture act as the consumer of the preview stream?
Environment: Samsung Exynos 8890 platform, Android 6.0.

Feature: camera2 preview.

Question: I want to measure the preview display latency. After the HAL enqueues a buffer, the frame is first handed to the camera's SurfaceTexture for processing, and only then to SurfaceFlinger for composition and display. I'd like to understand SurfaceTexture's processing flow in detail (Framework -> HAL -> Kernel): how does it enqueue the processed frame buffer to SurfaceFlinger for composition and display? (We want to add a timestamp at the point where it enqueues the buffer.)

Any pointers would be greatly appreciated.
android – getMeasuredHeight() and getMeasuredWidth() return 0 after View.measure()
layout_view.xml, the layout used to create the view:
    <FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
        android:layout_width="100dp"
        android:layout_height="100dp">
    </FrameLayout>
The function that measures the view's size:
    public void measureView(Context context) {
        LayoutInflater inflater = (LayoutInflater) context.getSystemService(Context.LAYOUT_INFLATER_SERVICE);
        View view = inflater.inflate(R.layout.layout_view, null, false);
        view.measure(View.MeasureSpec.UNSPECIFIED, View.MeasureSpec.UNSPECIFIED);
        Log.d(TAG, "Error width : " + view.getMeasuredWidth());
        Log.d(TAG, "Error height : " + view.getMeasuredHeight());
    }
Solution
    ViewTreeObserver vto = view.getViewTreeObserver();
    vto.addOnGlobalLayoutListener(new OnGlobalLayoutListener() {
        @Override
        public void onGlobalLayout() {
            int viewWidth = view.getMeasuredWidth();
            // handle viewWidth here...
            if (Build.VERSION.SDK_INT < 16) {
                view.getViewTreeObserver().removeGlobalOnLayoutListener(this);
            } else {
                view.getViewTreeObserver().removeOnGlobalLayoutListener(this);
            }
        }
    });
Note: remove the listener when you are done, for better performance. Don't use the stored vto to remove it (the observer captured earlier may no longer be alive); fetch it again, as in the snippet above:

    view.getViewTreeObserver()
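For cases where you genuinely need a size before layout (e.g. measuring a detached view), the key mechanism is the MeasureSpec int, which packs a mode and a size together. Below is a plain-Java sketch of that packing scheme, for illustration only: `MeasureSpecSketch` is a hypothetical stand-in for the real `android.view.View.MeasureSpec`, whose documented constants it mirrors.

```java
// Hypothetical stand-in for android.view.View.MeasureSpec, for illustration only.
// A measure spec packs a mode (top 2 bits) and a size (low 30 bits) into one int.
class MeasureSpecSketch {
    private static final int MODE_SHIFT = 30;
    private static final int MODE_MASK = 0x3 << MODE_SHIFT;

    static final int UNSPECIFIED = 0 << MODE_SHIFT; // parent imposes no constraint
    static final int EXACTLY     = 1 << MODE_SHIFT; // parent fixes an exact size
    static final int AT_MOST     = 2 << MODE_SHIFT; // child may be at most this size

    static int makeMeasureSpec(int size, int mode) {
        return (size & ~MODE_MASK) | (mode & MODE_MASK);
    }

    static int getMode(int spec) {
        return spec & MODE_MASK;
    }

    static int getSize(int spec) {
        return spec & ~MODE_MASK;
    }

    public static void main(String[] args) {
        int spec = makeMeasureSpec(300, EXACTLY); // e.g. 100dp at 3x density = 300px
        System.out.println(getSize(spec));            // 300
        System.out.println(getMode(spec) == EXACTLY); // true
    }
}
```

With the real API, measuring with exact specs — `view.measure(View.MeasureSpec.makeMeasureSpec(widthPx, View.MeasureSpec.EXACTLY), View.MeasureSpec.makeMeasureSpec(heightPx, View.MeasureSpec.EXACTLY))` — avoids the 0 result, because passing the bare `UNSPECIFIED` constant lets a view with no content report its minimum size.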
Android drawing: .measureText() vs .getTextBounds()
I am measuring text with Paint.getTextBounds(), since I'm interested in the height and width of the text to be rendered. However, the actual rendered text is always a bit wider than the .width() of the Rect filled in by getTextBounds().

To my surprise, when I tested .measureText(), I found it returns a different (larger) value. I tried it, and it seems to be the right one.

Why do they report different widths? How do I correctly get the height and width? I mean, I could use .measureText(), but then I don't know whether I should trust the .height() returned by getTextBounds().

As requested, here is minimal code to reproduce the problem:
    final String someText = "Hello. I believe I'm some text!";
    Paint p = new Paint();
    Rect bounds = new Rect();
    for (float f = 10; f < 40; f += 1f) {
        p.setTextSize(f);
        p.getTextBounds(someText, 0, someText.length(), bounds);
        Log.d("Test", String.format(
            "Size %f, measureText %f, getTextBounds %d",
            f, p.measureText(someText), bounds.width()));
    }
The output shows that the difference is not only greater than 1 (so it's not a last-step rounding error), but also seems to grow with the text size (I was about to draw more conclusions, but it may depend entirely on the font):
    D/Test ( 607): Size 10.000000, measureText 135.000000, getTextBounds 134
    D/Test ( 607): Size 11.000000, measureText 149.000000, getTextBounds 148
    D/Test ( 607): Size 12.000000, measureText 156.000000, getTextBounds 155
    D/Test ( 607): Size 13.000000, measureText 171.000000, getTextBounds 169
    D/Test ( 607): Size 14.000000, measureText 195.000000, getTextBounds 193
    D/Test ( 607): Size 15.000000, measureText 201.000000, getTextBounds 199
    D/Test ( 607): Size 16.000000, measureText 211.000000, getTextBounds 210
    D/Test ( 607): Size 17.000000, measureText 225.000000, getTextBounds 223
    D/Test ( 607): Size 18.000000, measureText 245.000000, getTextBounds 243
    D/Test ( 607): Size 19.000000, measureText 251.000000, getTextBounds 249
    D/Test ( 607): Size 20.000000, measureText 269.000000, getTextBounds 267
    D/Test ( 607): Size 21.000000, measureText 275.000000, getTextBounds 272
    D/Test ( 607): Size 22.000000, measureText 297.000000, getTextBounds 294
    D/Test ( 607): Size 23.000000, measureText 305.000000, getTextBounds 302
    D/Test ( 607): Size 24.000000, measureText 319.000000, getTextBounds 316
    D/Test ( 607): Size 25.000000, measureText 330.000000, getTextBounds 326
    D/Test ( 607): Size 26.000000, measureText 349.000000, getTextBounds 346
    D/Test ( 607): Size 27.000000, measureText 357.000000, getTextBounds 354
    D/Test ( 607): Size 28.000000, measureText 369.000000, getTextBounds 365
    D/Test ( 607): Size 29.000000, measureText 396.000000, getTextBounds 392
    D/Test ( 607): Size 30.000000, measureText 401.000000, getTextBounds 397
    D/Test ( 607): Size 31.000000, measureText 418.000000, getTextBounds 414
    D/Test ( 607): Size 32.000000, measureText 423.000000, getTextBounds 418
    D/Test ( 607): Size 33.000000, measureText 446.000000, getTextBounds 441
    D/Test ( 607): Size 34.000000, measureText 455.000000, getTextBounds 450
    D/Test ( 607): Size 35.000000, measureText 468.000000, getTextBounds 463
    D/Test ( 607): Size 36.000000, measureText 474.000000, getTextBounds 469
    D/Test ( 607): Size 37.000000, measureText 500.000000, getTextBounds 495
    D/Test ( 607): Size 38.000000, measureText 506.000000, getTextBounds 501
    D/Test ( 607): Size 39.000000, measureText 521.000000, getTextBounds 515
Answer 1

You can investigate a question like this the way I did:

Study the Android source. In the Paint.java source, look at both the measureText and getTextBounds methods. You will learn that measureText calls native_measureText, and getTextBounds calls nativeGetStringBounds, both of which are native methods implemented in C++.

So you continue with Paint.cpp, which implements both:

    native_measureText    -> SkPaintGlue::measureText_CII
    nativeGetStringBounds -> SkPaintGlue::getStringBounds

Now the question is how these methods differ. After some parameter checking, both call SkPaint::measureText in the Skia library (part of Android), but each calls a different overload.

Digging further into Skia, I found that both calls end up performing the same computation in the same function; they only return the result differently.

To answer your question: both of your calls perform the same computation. The possible difference in the results is that getTextBounds returns the bounds as integers, while measureText returns a float value.

So what you see is a rounding error introduced when float is converted to int, which happens in SkPaintGlue::doTextBounds in Paint.cpp, in the call to SkRect::roundOut.

The difference between the two computed widths can be at most 1.
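To make that rounding step concrete, here is a small plain-Java sketch of the horizontal part of what SkRect::roundOut does: the float left edge is floored and the float right edge is ceiled, so the resulting integer rect always contains the float rect. The method name `roundOutWidth` is my own invention for illustration; the real logic lives in Skia's C++ sources.

```java
// Sketch of the horizontal part of Skia's SkRect::roundOut:
// the integer rect is the smallest int rect containing the float rect.
class RoundOutSketch {
    static int roundOutWidth(float left, float right) {
        int iLeft = (int) Math.floor(left);  // expand outward on the left
        int iRight = (int) Math.ceil(right); // expand outward on the right
        return iRight - iLeft;
    }

    public static void main(String[] args) {
        // A glyph run spanning x = 0.4 .. 134.6 has a float width of 134.2,
        // but rounding the edges outward yields an integer width of 135.
        System.out.println(roundOutWidth(0.4f, 134.6f)); // 135
    }
}
```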
Edit, Oct 4, 2011:

What could be better than a visualization? I made the effort, for my own exploration and for a worthy bounty :)

This is font size 60; red is the bounds rect, purple is the result of measureText.

You can see that the bounds' left edge starts a few pixels in from the left, and measureText's value is increased by that amount on both the left and the right. This is the glyph's so-called AdvanceX value. (I found this in the Skia sources, in SkPaint.cpp.)

So the result of the test is that measureText adds some advance value to the text on both sides, while getTextBounds computes the minimal bounds into which the given text fits.

Hope this result is useful to you.

Test code:
    protected void onDraw(Canvas canvas) {
        final String s = "Hello. I'm some text!";
        Paint p = new Paint();
        Rect bounds = new Rect();
        p.setTextSize(60);
        p.getTextBounds(s, 0, s.length(), bounds);
        float mt = p.measureText(s);
        int bw = bounds.width();
        Log.i("LCG", String.format(
            "measureText %f, getTextBounds %d (%s)",
            mt, bw, bounds.toShortString()));
        bounds.offset(0, -bounds.top);
        p.setStyle(Style.STROKE);
        canvas.drawColor(0xff000080);
        p.setColor(0xffff0000);
        canvas.drawRect(bounds, p);
        p.setColor(0xff00ff00);
        canvas.drawText(s, 0, bounds.bottom, p);
    }
Example source code for android.graphics.SurfaceTexture
    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture, int i, int i1) {
        Log.i(TAG, "onSurfaceTextureAvailable [" + this.hashCode() + "] ");
        if (savedSurfaceTexture == null) {
            savedSurfaceTexture = surfaceTexture;
            prepare();
        } else {
            textureView.setSurfaceTexture(savedSurfaceTexture);
        }
    }
    public void open() {
        try {
            CameraManager manager = (CameraManager) mActivity.getSystemService(Context.CAMERA_SERVICE);
            for (String cameraId : manager.getCameraIdList()) {
                CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
                if (characteristics.get(CameraCharacteristics.LENS_FACING) == CameraCharacteristics.LENS_FACING_BACK) {
                    StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
                    mCameraSize = map.getOutputSizes(SurfaceTexture.class)[0];
                    HandlerThread thread = new HandlerThread("OpenCamera");
                    thread.start();
                    Handler backgroundHandler = new Handler(thread.getLooper());
                    manager.openCamera(cameraId, mCameraDeviceCallback, null);
                    // Get the camera's physical characteristics
                    mCameraCharacteristics = manager.getCameraCharacteristics(cameraId);
                    return;
                }
            }
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }
    @Override
    public void onOpened(CameraDevice camera) {
        checkIsOnCameraThread();
        Logging.d(TAG, "Camera opened.");
        cameraDevice = camera;
        final SurfaceTexture surfaceTexture = surfaceTextureHelper.getSurfaceTexture();
        surfaceTexture.setDefaultBufferSize(captureFormat.width, captureFormat.height);
        surface = new Surface(surfaceTexture);
        List<Surface> surfaces = new ArrayList<Surface>();
        surfaces.add(surface);
        if (mediaRecorderSurface != null) {
            Logging.d(TAG, "Add MediaRecorder surface to capture session.");
            surfaces.add(mediaRecorderSurface);
        }
        try {
            camera.createCaptureSession(surfaces, new CaptureSessionCallback(), cameraThreadHandler);
        } catch (CameraAccessException e) {
            reportError("Failed to create capture session. " + e);
            return;
        }
    }
    @Override
    public Point open(SurfaceTexture surface) {
        try {
            if (!extractMedia()) {
                return new Point(0, 0);
            }
            mFrameSem = new Semaphore(0);
            mDecodeSem = new Semaphore(1);
            videoProvideEndFlag = false;
            isUserWantToStop = false;
            mAudioEncodeTrack = mStore.addTrack(mExtractor.getTrackFormat(mAudioDecodeTrack));
            MediaFormat format = mExtractor.getTrackFormat(mVideoDecodeTrack);
            mVideoDecoder = MediaCodec.createDecoderByType(format.getString(MediaFormat.KEY_MIME));
            mVideoDecoder.configure(format, new Surface(surface), null, 0);
            mVideoDecoder.start();
            startDecodeThread();
        } catch (IOException e) {
            e.printStackTrace();
        }
        return mVideoSize;
    }
    @TargetApi(Build.VERSION_CODES.JELLY_BEAN)
    public void bindToMediaPlayer(IMediaPlayer mp) {
        if (mp == null)
            return;
        if ((Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN)
                && (mp instanceof ISurfaceTextureHolder)) {
            ISurfaceTextureHolder textureHolder = (ISurfaceTextureHolder) mp;
            mTextureView.mSurfaceCallback.setOwnSurfaceTexture(false);
            SurfaceTexture surfaceTexture = textureHolder.getSurfaceTexture();
            if (surfaceTexture != null) {
                mTextureView.setSurfaceTexture(surfaceTexture);
            } else {
                textureHolder.setSurfaceTexture(mSurfaceTexture);
                textureHolder.setSurfaceTextureHost(mTextureView.mSurfaceCallback);
            }
        } else {
            mp.setSurface(openSurface());
        }
    }
    EglSurface(final EGLEnvironment egl, final Object surface) {
        if (!(surface instanceof SurfaceView)
                && !(surface instanceof Surface)
                && !(surface instanceof SurfaceHolder)
                && !(surface instanceof SurfaceTexture))
            throw new IllegalArgumentException("unsupported surface");
        mEgl = egl;
        mEglSurface = mEgl.createWindowSurface(surface);
        mWidth = mEgl.querySurface(mEglSurface, EGL14.EGL_WIDTH);
        mHeight = mEgl.querySurface(mEglSurface, EGL14.EGL_HEIGHT);
    }
    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        mSurfaceTexture = surface;
        mIsFormatChanged = false;
        mWidth = 0;
        mHeight = 0;
        ISurfaceHolder surfaceHolder = new InternalSurfaceHolder(mWeakRenderView.get(), surface, this);
        for (IRenderCallback renderCallback : mRenderCallbackMap.keySet()) {
            renderCallback.onSurfaceCreated(surfaceHolder, 0, 0);
        }
    }
    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture texture) {
        synchronized (mCameraStateLock) {
            mPreviewSize = null;
        }
        return true;
    }
    private void initSurfaceTexture() {
        Log.d(LOGTAG, "initSurfaceTexture");
        deleteSurfaceTexture();
        initTexOES(texCamera);
        mSTexture = new SurfaceTexture(texCamera[0]);
        mSTexture.setOnFrameAvailableListener(this);
    }
    public InternalSurfaceHolder(@NonNull TextureRenderView textureView,
                                 @Nullable SurfaceTexture surfaceTexture,
                                 @NonNull ISurfaceTextureHost surfaceTextureHost) {
        mTextureView = textureView;
        mSurfaceTexture = surfaceTexture;
        mSurfaceTextureHost = surfaceTextureHost;
    }
    /**
     * Sets the {@link TextureView} onto which video will be rendered. The player will track the
     * lifecycle of the surface automatically.
     *
     * @param textureView The texture view.
     */
    public void setVideoTextureView(TextureView textureView) {
        removeSurfaceCallbacks();
        this.textureView = textureView;
        if (textureView == null) {
            setVideoSurfaceInternal(null, true);
        } else {
            if (textureView.getSurfaceTextureListener() != null) {
                Log.w(TAG, "Replacing existing SurfaceTextureListener.");
            }
            SurfaceTexture surfaceTexture = textureView.getSurfaceTexture();
            setVideoSurfaceInternal(surfaceTexture == null ? null : new Surface(surfaceTexture), true);
            textureView.setSurfaceTextureListener(componentListener);
        }
    }
    @Override
    public void onCameraSurfaceDestroy(SurfaceTexture surfaceTexture) {
        Log.d(TAG, "onCameraSurfaceDestroy");
        isSurfaceReady = false;
        mCamera.stopPreview();
        mCamera.release();
        if (mVideoRecorder.isRecording()) {
            mVideoRecorder.stop();
        }
    }
    @Override
    public void releaseSurfaceTexture(SurfaceTexture surfaceTexture) {
        if (surfaceTexture == null) {
            Log.d(TAG, "releaseSurfaceTexture: null");
        } else if (mDidDetachFromWindow) {
            if (surfaceTexture != mSurfaceTexture) {
                Log.d(TAG, "releaseSurfaceTexture: didDetachFromWindow(): release different SurfaceTexture");
                surfaceTexture.release();
            } else if (!mOwnSurfaceTexture) {
                Log.d(TAG, "releaseSurfaceTexture: didDetachFromWindow(): release detached SurfaceTexture");
                surfaceTexture.release();
            } else {
                Log.d(TAG, "releaseSurfaceTexture: didDetachFromWindow(): already released by TextureView");
            }
        } else if (mWillDetachFromWindow) {
            if (surfaceTexture != mSurfaceTexture) {
                Log.d(TAG, "releaseSurfaceTexture: willDetachFromWindow(): release different SurfaceTexture");
                surfaceTexture.release();
            } else if (!mOwnSurfaceTexture) {
                Log.d(TAG, "releaseSurfaceTexture: willDetachFromWindow(): re-attach SurfaceTexture to TextureView");
                setOwnSurfaceTexture(true);
            } else {
                Log.d(TAG, "releaseSurfaceTexture: willDetachFromWindow(): will be released by TextureView");
            }
        } else {
            if (surfaceTexture != mSurfaceTexture) {
                Log.d(TAG, "releaseSurfaceTexture: alive: release different SurfaceTexture");
                surfaceTexture.release();
            } else if (!mOwnSurfaceTexture) {
                Log.d(TAG, "releaseSurfaceTexture: alive: re-attach SurfaceTexture to TextureView");
                setOwnSurfaceTexture(true);
            } else {
                Log.d(TAG, "releaseSurfaceTexture: alive: will be released by TextureView");
            }
        }
    }
    public static void startPreview(SurfaceTexture surfaceTexture) {
        try {
            mCamera.setPreviewTexture(surfaceTexture);
            mCamera.startPreview();
        } catch (IOException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }
    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        GLES20.glClearColor(0.0f, 1.0f, 0.0f, 0.0f);
        GLES20.glDisable(GLES20.GL_DEPTH_TEST);
        mTexture = OpenGlUtils.genOesTexture();
        mGLImageView.setImageTexture(mTexture);
        mSurfaceTexture = new SurfaceTexture(mTexture);
        mSurfaceTexture.setOnFrameAvailableListener(this);
        mVideoPlayer.setOutSurface(new Surface(mSurfaceTexture));
    }
    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
        if (videoPlayer == null) {
            return true;
        }
        videoPlayer.setDisplay(null);
        return true;
    }
    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
        mSurfaceTexture = surface;
        mIsFormatChanged = false;
        mWidth = 0;
        mHeight = 0;
        ISurfaceHolder surfaceHolder = new InternalSurfaceHolder(mWeakRenderView.get(), surface, this);
        for (IRenderCallback renderCallback : mRenderCallbackMap.keySet()) {
            renderCallback.onSurfaceDestroyed(surfaceHolder);
        }
        Log.d(TAG, "onSurfaceTextureDestroyed: destroy: " + mOwnSurfaceTexture);
        return mOwnSurfaceTexture;
    }
    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
        mSurfaceTexture = surface;
        mIsFormatChanged = true;
        mWidth = width;
        mHeight = height;
        ISurfaceHolder surfaceHolder = new InternalSurfaceHolder(mWeakRenderView.get(), surface, this);
        for (IRenderCallback renderCallback : mRenderCallbackMap.keySet()) {
            renderCallback.onSurfaceChanged(surfaceHolder, width, height);
        }
    }
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        this.mSurfaceTexture = surface;
        this.mIsFormatChanged = false;
        this.mWidth = 0;
        this.mHeight = 0;
        ISurfaceHolder surfaceHolder = new InternalSurfaceHolder(
                (TextureRenderView) this.mWeakRenderView.get(), surface, this);
        for (IRenderCallback renderCallback : this.mRenderCallbackMap.keySet()) {
            renderCallback.onSurfaceCreated(surfaceHolder, 0, 0);
        }
    }
    /**
     * Opens the camera and starts sending preview frames to the underlying detector. The preview
     * frames are not displayed.
     *
     * @throws IOException if the camera's preview texture or display could not be initialized
     */
    @RequiresPermission(Manifest.permission.CAMERA)
    public CameraSource start() throws IOException {
        synchronized (mCameraLock) {
            if (mCamera != null) {
                return this;
            }
            mCamera = createCamera();
            // SurfaceTexture was introduced in Honeycomb (11), so if we are running an
            // old version of Android, fall back to using a SurfaceView.
            if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.HONEYCOMB) {
                mDummySurfaceTexture = new SurfaceTexture(DUMMY_TEXTURE_NAME);
                mCamera.setPreviewTexture(mDummySurfaceTexture);
            } else {
                mDummySurfaceView = new SurfaceView(mContext);
                mCamera.setPreviewDisplay(mDummySurfaceView.getHolder());
            }
            mCamera.startPreview();
            mProcessingThread = new Thread(mFrameProcessor);
            mFrameProcessor.setActive(true);
            mProcessingThread.start();
        }
        return this;
    }
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        Log.d(TAG, "onSurfaceTextureAvailable " + surface + " " + width + " " + height);
        initHandlerThread();
        Message msg = Message.obtain(mGLHandler, MSG_TYPE_SURFACE_CREATED, surface);
        mGLHandler.sendMessage(msg);
        msg = Message.obtain(mGLHandler, MSG_TYPE_SURFACE_CHANGED, width, height);
        mGLHandler.sendMessage(msg);
    }
    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        mSurface = new Surface(surface);
        if (mTargetState == STATE_PLAYING) {
            if (SHOW_LOGS) Log.i(TAG, "onSurfaceTextureAvailable start");
            start();
        }
    }
    /**
     * Handle the display logic.
     */
    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        super.onSurfaceTextureAvailable(surface, width, height);
        resolveRotateUI();
        resolveTransform();
    }
    @Override // will be called on UI thread
    public boolean onSurfaceTextureDestroyed(SurfaceTexture st) {
        Log.d(TAG, "onSurfaceTextureDestroyed");
        synchronized (mLock) {
            mSurfaceTexture = null;
        }
        return true;
    }
    public Size[] getCameraResolutionsFront() {
        try {
            CameraCharacteristics cameraCharacteristics = cameraManager.getCameraCharacteristics("0");
            if (cameraCharacteristics.get(CameraCharacteristics.LENS_FACING) != CameraCharacteristics.LENS_FACING_FRONT) {
                cameraCharacteristics = cameraManager.getCameraCharacteristics("1");
            }
            StreamConfigurationMap streamConfigurationMap =
                    cameraCharacteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            return streamConfigurationMap.getOutputSizes(SurfaceTexture.class);
        } catch (CameraAccessException e) {
            Log.e(TAG, e.getMessage());
            return new Size[0];
        }
    }
    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        if (mOnEvent != null) mOnEvent.surfaceTextureAvailable(surface);
        mWidth = width;
        mHeight = height;
    }
    @Override
    public void onFrameAvailable(SurfaceTexture st) {
        synchronized (mFrameSyncObject) {
            if (mFrameAvailable) {
                throw new RuntimeException("mFrameAvailable already set, frame could be dropped");
            }
            mFrameAvailable = true;
            mFrameSyncObject.notifyAll();
        }
    }
    @Override
    public void setSurfaceTexture(SurfaceTexture surfaceTexture) {
        if (mSurfaceTexture == surfaceTexture) return;
        releaseSurfaceTexture();
        mSurfaceTexture = surfaceTexture;
        if (surfaceTexture == null) {
            super.setSurface(null);
        } else {
            super.setSurface(new Surface(surfaceTexture));
        }
    }
    private void setup() {
        mTextureRender = new TextureRenderer(rotateRender);
        mTextureRender.surfaceCreated();
        mSurfaceTexture = new SurfaceTexture(mTextureRender.getTextureId());
        mSurfaceTexture.setOnFrameAvailableListener(this);
        mSurface = new Surface(mSurfaceTexture);
    }
    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture texture) {
        if (mTextureView != null) {
            mTextureView.setSurfaceTextureListener(null);
        }
        return true;
    }
That concludes our discussion of the TextRenderer.MeasureText and Graphics.MeasureString size mismatch. Thanks for reading. If you'd like to learn more about the related topics covered above — the Android camera SurfaceTexture as consumer of the preview stream, getMeasuredHeight()/getMeasuredWidth() returning 0 after View.measure(), Android's .measureText() vs .getTextBounds(), and the android.graphics.SurfaceTexture example sources — you can find more on this site.