How to use onSensorChanged sensor data in combination with OpenGL
(Edit: I added the best working approach to my augmented reality framework, which now also takes the gyroscope into account and makes it much more stable again: the DroidAR framework.)
I wrote a TestSuite to find out how to calculate the rotation angles from the data you get in SensorEventListener.onSensorChanged(). I really hope you can complete my solution to help people who run into the same problems I did. Here is the code; I think you will understand it after reading it.
Feel free to change it. The main idea was to implement several methods that send the orientation angles to the OpenGL view, or to any other target that needs them.
Methods 1 to 4 are working; they send the rotationMatrix directly to the OpenGL view.
Method 6 works now too, but I have no explanation why the rotation has to be done in y x z order.
All the other methods are not working or are buggy, and I hope someone can get them to work. I think the best method would be method 5 if it worked, because it would be the easiest to understand, though I'm not sure how efficient it is. The complete code isn't optimized, so I recommend not using it as-is in your projects.
Here it is:
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import javax.microedition.khronos.egl.EGL10;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;
import android.app.Activity;
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.opengl.GLSurfaceView;
import android.opengl.GLSurfaceView.Renderer;
import android.os.Bundle;
import android.util.Log;
import android.view.WindowManager;
/**
* This class provides a basic demonstration of how to use the
* {@link android.hardware.SensorManager SensorManager} API to draw a 3D
* compass.
*/
public class SensorToOpenGlTests extends Activity implements Renderer,
SensorEventListener {
private static final boolean TRY_TRANSPOSED_VERSION = false;
/*
* MODUS overview:
*
* 1 - unbuffered data directly transferred from the rotation matrix to the
* modelview matrix
*
* 2 - buffered version of 1 where both acceleration and magnetometer are
* buffered
*
* 3 - buffered version of 1 where only magnetometer is buffered
*
* 4 - buffered version of 1 where only acceleration is buffered
*
* 5 - uses the orientation sensor and sets the angles for how to rotate
* the camera with glRotatef()
*
* 6 - uses the rotation matrix to calculate the angles
*
* 7 to 12 - every possible way the rotationMatrix could be constructed in
* SensorManager.getRotationMatrix (see
* http://www.songho.ca/opengl/gl_anglestoaxes.html#anglestoaxes for all
* possibilities)
*/
private static int MODUS = 2;
private GLSurfaceView openglView;
private FloatBuffer vertexBuffer;
private ByteBuffer indexBuffer;
private FloatBuffer colorBuffer;
private SensorManager mSensorManager;
private float[] rotationMatrix = new float[16];
private float[] accelGData = new float[3];
private float[] bufferedAccelGData = new float[3];
private float[] magnetData = new float[3];
private float[] bufferedMagnetData = new float[3];
private float[] orientationData = new float[3];
// private float[] mI = new float[16];
private float[] resultingAngles = new float[3];
private int mCount;
final static float rad2deg = (float) (180.0f / Math.PI);
private boolean landscape;
public SensorToOpenGlTests() {
}
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
openglView = new GLSurfaceView(this);
openglView.setRenderer(this);
setContentView(openglView);
}
@Override
protected void onResume() {
// Ideally a game should implement onResume() and onPause()
// to take appropriate action when the activity loses focus
super.onResume();
openglView.onResume();
if (((WindowManager) getSystemService(WINDOW_SERVICE))
.getDefaultDisplay().getOrientation() == 1) {
landscape = true;
} else {
landscape = false;
}
mSensorManager.registerListener(this, mSensorManager
.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
SensorManager.SENSOR_DELAY_GAME);
mSensorManager.registerListener(this, mSensorManager
.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
SensorManager.SENSOR_DELAY_GAME);
mSensorManager.registerListener(this, mSensorManager
.getDefaultSensor(Sensor.TYPE_ORIENTATION),
SensorManager.SENSOR_DELAY_GAME);
}
@Override
protected void onPause() {
// Ideally a game should implement onResume() and onPause()
// to take appropriate action when the activity loses focus
super.onPause();
openglView.onPause();
mSensorManager.unregisterListener(this);
}
public int[] getConfigSpec() {
// We want a depth buffer, don't care about the
// details of the color buffer.
int[] configSpec = { EGL10.EGL_DEPTH_SIZE, 16, EGL10.EGL_NONE };
return configSpec;
}
public void onDrawFrame(GL10 gl) {
// clear screen and color buffer:
gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
// set target matrix to modelview matrix:
gl.glMatrixMode(GL10.GL_MODELVIEW);
// init modelview matrix:
gl.glLoadIdentity();
// move camera away a little bit:
if ((MODUS == 1) || (MODUS == 2) || (MODUS == 3) || (MODUS == 4)) {
if (landscape) {
// in landscape mode first remap the rotationMatrix before using
// it with glMultMatrixf:
float[] result = new float[16];
SensorManager.remapCoordinateSystem(rotationMatrix,
SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X,
result);
gl.glMultMatrixf(result, 0);
} else {
gl.glMultMatrixf(rotationMatrix, 0);
}
} else {
// in all other modes do the rotation by hand;
// the order y x z is important!
gl.glRotatef(resultingAngles[2], 0, 1, 0);
gl.glRotatef(resultingAngles[1], 1, 0, 0);
gl.glRotatef(resultingAngles[0], 0, 0, 1);
}
// move the axes to simulate augmented reality behaviour:
gl.glTranslatef(0, 2, 0);
// draw the 3 axis on the screen:
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, vertexBuffer);
gl.glColorPointer(4, GL10.GL_FLOAT, 0, colorBuffer);
gl.glDrawElements(GL10.GL_LINES, 6, GL10.GL_UNSIGNED_BYTE, indexBuffer);
}
public void onSurfaceChanged(GL10 gl, int width, int height) {
gl.glViewport(0, 0, width, height);
float r = (float) width / height;
gl.glMatrixMode(GL10.GL_PROJECTION);
gl.glLoadIdentity();
gl.glFrustumf(-r, r, -1, 1, 1, 10);
}
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
gl.glDisable(GL10.GL_DITHER);
gl.glClearColor(1, 1, 1, 1);
gl.glEnable(GL10.GL_CULL_FACE);
gl.glShadeModel(GL10.GL_SMOOTH);
gl.glEnable(GL10.GL_DEPTH_TEST);
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
gl.glEnableClientState(GL10.GL_COLOR_ARRAY);
// load the 3 axes and their colors:
float vertices[] = { 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1 };
float colors[] = { 0, 0, 0, 0, 1, 0, 0, 1, 0, 1, 0, 1, 0, 0, 1, 1 };
byte indices[] = { 0, 1, 0, 2, 0, 3 };
ByteBuffer vbb;
vbb = ByteBuffer.allocateDirect(vertices.length * 4);
vbb.order(ByteOrder.nativeOrder());
vertexBuffer = vbb.asFloatBuffer();
vertexBuffer.put(vertices);
vertexBuffer.position(0);
vbb = ByteBuffer.allocateDirect(colors.length * 4);
vbb.order(ByteOrder.nativeOrder());
colorBuffer = vbb.asFloatBuffer();
colorBuffer.put(colors);
colorBuffer.position(0);
indexBuffer = ByteBuffer.allocateDirect(indices.length);
indexBuffer.put(indices);
indexBuffer.position(0);
}
public void onAccuracyChanged(Sensor sensor, int accuracy) {
}
public void onSensorChanged(SensorEvent event) {
// load the new values:
loadNewSensorData(event);
if (MODUS == 1) {
SensorManager.getRotationMatrix(rotationMatrix, null, accelGData,
magnetData);
}
if (MODUS == 2) {
rootMeanSquareBuffer(bufferedAccelGData, accelGData);
rootMeanSquareBuffer(bufferedMagnetData, magnetData);
SensorManager.getRotationMatrix(rotationMatrix, null,
bufferedAccelGData, bufferedMagnetData);
}
if (MODUS == 3) {
rootMeanSquareBuffer(bufferedMagnetData, magnetData);
SensorManager.getRotationMatrix(rotationMatrix, null, accelGData,
bufferedMagnetData);
}
if (MODUS == 4) {
rootMeanSquareBuffer(bufferedAccelGData, accelGData);
SensorManager.getRotationMatrix(rotationMatrix, null,
bufferedAccelGData, magnetData);
}
if (MODUS == 5) {
// this mode uses the sensor data received from the orientation
// sensor
resultingAngles = orientationData.clone();
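// if the pitch angle (index 1) leaves the [-90, 90] range, remap the
// axes; an experiment, and part of why this mode is still buggy: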
if ((-90 > resultingAngles[1]) || (resultingAngles[1] > 90)) {
resultingAngles[1] = orientationData[0];
resultingAngles[2] = orientationData[1];
resultingAngles[0] = orientationData[2];
}
}
if (MODUS == 6) {
SensorManager.getRotationMatrix(rotationMatrix, null, accelGData,
magnetData);
final float[] anglesInRadians = new float[3];
SensorManager.getOrientation(rotationMatrix, anglesInRadians);
//TODO check for landscape mode
resultingAngles[0] = anglesInRadians[0] * rad2deg;
resultingAngles[1] = anglesInRadians[1] * rad2deg;
resultingAngles[2] = anglesInRadians[2] * -rad2deg;
}
if (MODUS == 7) {
SensorManager.getRotationMatrix(rotationMatrix, null, accelGData,
magnetData);
rotationMatrix = transpose(rotationMatrix);
/*
* this assumes that the rotation matrices are multiplied in x y z
* order Rx*Ry*Rz
*/
resultingAngles[2] = (float) (Math.asin(rotationMatrix[2]));
final float cosB = (float) Math.cos(resultingAngles[2]);
resultingAngles[2] = resultingAngles[2] * rad2deg;
resultingAngles[0] = -(float) (Math.acos(rotationMatrix[0] / cosB))
* rad2deg;
resultingAngles[1] = (float) (Math.acos(rotationMatrix[10] / cosB))
* rad2deg;
}
if (MODUS == 8) {
SensorManager.getRotationMatrix(rotationMatrix, null, accelGData,
magnetData);
rotationMatrix = transpose(rotationMatrix);
/*
* this assumes that the rotation matrices are multiplied in z y x
*/
resultingAngles[2] = (float) (Math.asin(-rotationMatrix[8]));
final float cosB = (float) Math.cos(resultingAngles[2]);
resultingAngles[2] = resultingAngles[2] * rad2deg;
resultingAngles[1] = (float) (Math.acos(rotationMatrix[9] / cosB))
* rad2deg;
resultingAngles[0] = (float) (Math.asin(rotationMatrix[4] / cosB))
* rad2deg;
}
if (MODUS == 9) {
SensorManager.getRotationMatrix(rotationMatrix, null, accelGData,
magnetData);
rotationMatrix = transpose(rotationMatrix);
/*
* this assumes that the rotation matrices are multiplied in z x y
*
* note z axis looks good at this one
*/
resultingAngles[1] = (float) (Math.asin(rotationMatrix[9]));
final float minusCosA = -(float) Math.cos(resultingAngles[1]);
resultingAngles[1] = resultingAngles[1] * rad2deg;
resultingAngles[2] = (float) (Math.asin(rotationMatrix[8]
/ minusCosA))
* rad2deg;
resultingAngles[0] = (float) (Math.asin(rotationMatrix[1]
/ minusCosA))
* rad2deg;
}
if (MODUS == 10) {
SensorManager.getRotationMatrix(rotationMatrix, null, accelGData,
magnetData);
rotationMatrix = transpose(rotationMatrix);
/*
* this assumes that the rotation matrices are multiplied in y x z
*/
resultingAngles[1] = (float) (Math.asin(-rotationMatrix[6]));
final float cosA = (float) Math.cos(resultingAngles[1]);
resultingAngles[1] = resultingAngles[1] * rad2deg;
resultingAngles[2] = (float) (Math.asin(rotationMatrix[2] / cosA))
* rad2deg;
resultingAngles[0] = (float) (Math.acos(rotationMatrix[5] / cosA))
* rad2deg;
}
if (MODUS == 11) {
SensorManager.getRotationMatrix(rotationMatrix, null, accelGData,
magnetData);
rotationMatrix = transpose(rotationMatrix);
/*
* this assumes that the rotation matrices are multiplied in y z x
*/
resultingAngles[0] = (float) (Math.asin(rotationMatrix[4]));
final float cosC = (float) Math.cos(resultingAngles[0]);
resultingAngles[0] = resultingAngles[0] * rad2deg;
resultingAngles[2] = (float) (Math.acos(rotationMatrix[0] / cosC))
* rad2deg;
resultingAngles[1] = (float) (Math.acos(rotationMatrix[5] / cosC))
* rad2deg;
}
if (MODUS == 12) {
SensorManager.getRotationMatrix(rotationMatrix, null, accelGData,
magnetData);
rotationMatrix = transpose(rotationMatrix);
/*
* this assumes that the rotation matrices are multiplied in x z y
*/
resultingAngles[0] = (float) (Math.asin(-rotationMatrix[1]));
final float cosC = (float) Math.cos(resultingAngles[0]);
resultingAngles[0] = resultingAngles[0] * rad2deg;
resultingAngles[2] = (float) (Math.acos(rotationMatrix[0] / cosC))
* rad2deg;
resultingAngles[1] = (float) (Math.acos(rotationMatrix[5] / cosC))
* rad2deg;
}
logOutput();
}
/**
* Transposes the matrix. For a pure rotation matrix the transpose is the
* same as the inverse, which is what OpenGL needs here (only applied when
* TRY_TRANSPOSED_VERSION is set).
*
* @param source the rotation matrix to transpose
* @return a copy of the matrix, transposed if TRY_TRANSPOSED_VERSION is true
*/
private float[] transpose(float[] source) {
final float[] result = source.clone();
if (TRY_TRANSPOSED_VERSION) {
result[1] = source[4];
result[2] = source[8];
result[4] = source[1];
result[6] = source[9];
result[8] = source[2];
result[9] = source[6];
}
// the other values in the matrix are not relevant for rotations
return result;
}
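/**
* Smooths the new sensor values into the target buffer: each sample is
* mixed with the buffered value using a root-mean-square weighting of
* buffer:1 (here 20:1), so the buffer follows the raw sensor slowly. The
* amplification offset shifts all values into the positive range first,
* because squaring would otherwise destroy the sign of negative readings.
*/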
private void rootMeanSquareBuffer(float[] target, float[] values) {
final float amplification = 200.0f;
float buffer = 20.0f;
target[0] += amplification;
target[1] += amplification;
target[2] += amplification;
values[0] += amplification;
values[1] += amplification;
values[2] += amplification;
target[0] = (float) (Math
.sqrt((target[0] * target[0] * buffer + values[0] * values[0])
/ (1 + buffer)));
target[1] = (float) (Math
.sqrt((target[1] * target[1] * buffer + values[1] * values[1])
/ (1 + buffer)));
target[2] = (float) (Math
.sqrt((target[2] * target[2] * buffer + values[2] * values[2])
/ (1 + buffer)));
target[0] -= amplification;
target[1] -= amplification;
target[2] -= amplification;
values[0] -= amplification;
values[1] -= amplification;
values[2] -= amplification;
}
private void loadNewSensorData(SensorEvent event) {
final int type = event.sensor.getType();
if (type == Sensor.TYPE_ACCELEROMETER) {
accelGData = event.values.clone();
}
if (type == Sensor.TYPE_MAGNETIC_FIELD) {
magnetData = event.values.clone();
}
if (type == Sensor.TYPE_ORIENTATION) {
orientationData = event.values.clone();
}
}
private void logOutput() {
if (mCount++ > 30) {
mCount = 0;
Log.d("Compass", "yaw0: " + (int) (resultingAngles[0])
+ " pitch1: " + (int) (resultingAngles[1]) + " roll2: "
+ (int) (resultingAngles[2]));
}
}
}
I haven't been able to test the code yet (but it looks really interesting). One thing that caught my attention is that you don't seem to filter the sensor data in any way.
Sensor readings are inherently very noisy, especially from the magnetic sensor. I would suggest implementing low-pass filtering.
See my previous answer for further reading.
It would be easier to test and debug Method 5 using GLU's lookAt function: http://www.opengl.org/sdk/docs/man2/xhtml/gluLookAt.xml
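A minimal sketch of that idea (my own assumption of how it would map onto the code above, not tested): inside onDrawFrame(GL10 gl), after glLoadIdentity(), derive a look direction from the azimuth and pitch of method 5 and hand it to android.opengl.GLU.gluLookAt instead of rotating by hand:
// assumes: import android.opengl.GLU;
// resultingAngles[0] = azimuth, resultingAngles[1] = pitch, in degrees
float azimuth = (float) Math.toRadians(resultingAngles[0]);
float pitch = (float) Math.toRadians(resultingAngles[1]);
// unit look direction on the sphere described by azimuth and pitch:
float lookX = (float) (Math.cos(pitch) * Math.sin(azimuth));
float lookY = (float) (Math.cos(pitch) * Math.cos(azimuth));
float lookZ = (float) Math.sin(pitch);
// camera at the origin, looking along the computed direction, z axis up:
GLU.gluLookAt(gl, 0f, 0f, 0f, lookX, lookY, lookZ, 0f, 0f, 1f);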
Also, as villoren suggested, it's good to filter your sensor data, but it wouldn't really cause bugs if you move the device slowly. If you want to try, a simple filter would be as follows:
newValue = oldValue * 0.9 + sensorValue * 0.1;
oldValue = newValue;
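A minimal Java version of that filter as it could look in the code above (the lowPass helper and the 0.1 factor are my assumptions, not part of the original answer):
/** Simple exponential low-pass filter; blends each new sample into the old one. */
private static float[] lowPass(float[] input, float[] output) {
    if (output == null) {
        return input.clone();
    }
    final float alpha = 0.1f; // smaller alpha = smoother but laggier values
    for (int i = 0; i < input.length; i++) {
        output[i] = output[i] * (1f - alpha) + input[i] * alpha;
    }
    return output;
}
// possible usage inside loadNewSensorData(SensorEvent event):
// accelGData = lowPass(event.values.clone(), accelGData);
// magnetData = lowPass(event.values.clone(), magnetData);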
After analyzing your code above: in method 5 you are assigning the orientation data as follows,
resultingAngles[1] = orientationData[0]; // orientation z axis to y axis
resultingAngles[2] = orientationData[1]; // orientation x axis to z axis
resultingAngles[0] = orientationData[2]; // orientation y axis to x axis
You have done the rotation in y z x order. Try changing the orientation mapping; I think the problem might be there. Please check and let me know.
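For example, a sketch of the straightforward mapping (my assumption, based on the documented axes: azimuth around z, pitch around x, roll around y), which matches the axis assignment the glRotatef calls in onDrawFrame expect (index 0 to z, 1 to x, 2 to y):
// keep each orientation angle on its documented axis:
resultingAngles[0] = orientationData[0]; // azimuth, rotation around z
resultingAngles[1] = orientationData[1]; // pitch, rotation around x
resultingAngles[2] = orientationData[2]; // roll, rotation around y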
Please refer to the documentation for the event values: http://developer.android.com/guide/topics/sensors/sensors_position.html
Thanks for your hard work.
Note that if you are getting consistently wrong readings, you may have to calibrate your compass by moving the device in a figure-8 motion with your wrist.
Hard to explain this in words; watch this video: http://www.youtube.com/watch?v=sP3d00Hr14o
You can use AndEngine for using sensors with OpenGL. Just check this example: https://github.com/nicolasgramlich/AndEngineExamples/tree/GLES2/src/org/andengine/examples/app/cityradar
Check out the Sensor fusion demo app, which uses different sensors (gyroscope, rotation vector, accelerometer + compass, etc.) and renders the output of the onSensorChanged events as a coloured cube that rotates according to your phone's orientation.
The results from those events are stored as quaternions and rotation matrices and used in a class that renders the cube with OpenGL.
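As a rough sketch of that rotation-vector approach (my assumption of how it would map onto the code above; requires API level 9+):
// register for the rotation vector instead of accelerometer + magnetometer:
mSensorManager.registerListener(this,
        mSensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR),
        SensorManager.SENSOR_DELAY_GAME);
// in onSensorChanged(SensorEvent event):
if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
    // converts the vector into a matrix that glMultMatrixf can use directly:
    SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
}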