       ---------------------------------------------------------
       MiT Labs
  HTML https://mitlabsindia.createaforum.com
       ---------------------------------------------------------
       *****************************************************
   DIR Return to: General
       *****************************************************
       #Post#: 8--------------------------------------------------
       Android Soft Keyboard tutorial in Swipe/Multi Touch Typing
       By: PatrickManzano Date: July 11, 2013, 10:54 am
       ---------------------------------------------------------
        Hi, I'm developing my own custom Android soft keyboard
        application, and I am now working on swipe typing (also called
        multi-touch typing), but I don't know how to implement it. Can
        anyone help me? Thanks.
        I have some ideas on how this could work, but I'm really not a
        programmer, so I want to learn from you guys:
       1) Capture touch events on the keyboard view
       2) Get (x,y) coordinates from the touch event
       3) Keep a list of all (x,y) coordinates of the current swipe
       (ArrayList or LinkedList)
        4) When the user lifts their finger and ends the swipe (another
        touch event), use the list of touch event points to predict the
        word.
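        The four steps above can be sketched framework-free. This is an
        illustrative sketch, not code from the thread: on a real
        keyboard the three callbacks would be driven from the keyboard
        view's onTouchEvent(), and the returned path would then be
        matched against key positions and a dictionary (the prediction
        step itself is not shown).

```java
import java.util.ArrayList;
import java.util.List;

// Minimal framework-free sketch of steps 1-4: collect (x, y) points
// during a swipe and hand the whole path off when the finger lifts.
public class SwipeRecorder {
    private final List<float[]> path = new ArrayList<>();

    // Step 1/2: a touch went down; start a new path at (x, y).
    public void onDown(float x, float y) {
        path.clear();
        path.add(new float[] {x, y});
    }

    // Step 3: record each intermediate position of the swipe.
    public void onMove(float x, float y) {
        path.add(new float[] {x, y});
    }

    // Step 4: the finger lifted; return the full recorded path.
    // A real keyboard would run word prediction over this list.
    public List<float[]> onUp(float x, float y) {
        path.add(new float[] {x, y});
        return new ArrayList<>(path);
    }
}
```

        On Android, onDown/onMove/onUp would correspond to the
        ACTION_DOWN, ACTION_MOVE, and ACTION_UP cases discussed later
        in this thread.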
        For swipe typing we also need a dictionary. Please click the
        link to see what it's like:
  HTML http://www.facebook.com/l.php?u=https%3A%2F%2Fplay.google.com%2Fstore%2Fapps%2Fdetails%3Fid%3Dcom.cbnewham.ascribblefree.android%26feature%3Dsearch_result&h=QAQHioT9E
        Here is one of the soft keyboards that I used as a good
        starting point for developing my own keyboard:
  HTML https://www.google.com.ph/url?sa=t&rct=j&q=&esrc=s&source=web&cd=2&cad=rja&ved=0CDUQFjAB&url=http%3A%2F%2Fcode.google.com%2Fp%2Fscandinavian-keyboard%2F&ei=ycHdUczrNMKfiQeK3YCACg&usg=AFQjCNHO6UnbiBMLA5z_t8BYKUFjsC1hGA&sig2=U4e6wyo-LS-7nf2mdfgt3w&bvm=bv.48705608,d.aGc
        That way it is not too hard to start from scratch, and this
        keyboard also has the dictionary that will be used in swipe
        typing, sir.
        I NEED YOUR HELP, GUYS, WITH A COMPLETE TUTORIAL.
       #Post#: 10--------------------------------------------------
       Re: Android Soft Keyboard tutorial in Swipe/Multi Touch Typing
       By: ShivamMiT Date: July 11, 2013, 11:22 am
       ---------------------------------------------------------
        Good progress, though! I was thinking you could take one of the
        keyboards that already supports this kind of gesture/swipe
        input as a base, and then theme or edit it to finish it. That
        would be way better.
        P.S.: Try to add voice typing.
       #Post#: 11--------------------------------------------------
       Re: Android Soft Keyboard tutorial in Swipe/Multi Touch Typing
       By: PatrickManzano Date: July 11, 2013, 6:47 pm
       ---------------------------------------------------------
        Hello sir, that would be a lot better, but I can't find even
        one free keyboard source code with swipe typing. Not even one
        tutorial exists. That's why I need your help with swipe, sir.
        If you've found an existing keyboard with swipe typing, it
        would really help me.
        By the way, sir, I had already added voice typing to my
        modified keyboard, because there is a tutorial for that. That's
        why I am now working on swipe typing, but I can't do it and I
        don't know how; I'm really stuck here on swipe, sir.
        Please, I really need to learn swipe.
        I really need your help, sir.
       #Post#: 12--------------------------------------------------
       Re: Android Soft Keyboard tutorial in Swipe/Multi Touch Typing
       By: ShivamMiT Date: July 11, 2013, 11:44 pm
       ---------------------------------------------------------
       The word “multitouch” gets thrown around quite a bit and it’s
       not always clear what people are referring to. For some it’s
       about hardware capability, for others it refers to specific
       gesture support in software. Whatever you decide to call it,
       today we’re going to look at how to make your apps and views
       behave nicely with multiple fingers on the screen.
       This post is going to be heavy on code examples. It will cover
       creating a custom View that responds to touch events and allows
       the user to manipulate an object drawn within it. To get the
       most out of the examples you should be familiar with setting up
       an Activity and the basics of the Android UI system. Full
       project source will be linked at the end.
       We’ll begin with a new View class that draws an object (our
       application icon) at a given position:
        public class TouchExampleView extends View {
            private Drawable mIcon;
            private float mPosX;
            private float mPosY;

            private float mLastTouchX;
            private float mLastTouchY;

            public TouchExampleView(Context context) {
                this(context, null, 0);
            }

            public TouchExampleView(Context context, AttributeSet attrs) {
                this(context, attrs, 0);
            }

            public TouchExampleView(Context context, AttributeSet attrs, int defStyle) {
                super(context, attrs, defStyle);
                mIcon = context.getResources().getDrawable(R.drawable.icon);
                mIcon.setBounds(0, 0, mIcon.getIntrinsicWidth(), mIcon.getIntrinsicHeight());
            }

            @Override
            public void onDraw(Canvas canvas) {
                super.onDraw(canvas);

                canvas.save();
                canvas.translate(mPosX, mPosY);
                mIcon.draw(canvas);
                canvas.restore();
            }

            @Override
            public boolean onTouchEvent(MotionEvent ev) {
                // More to come here later...
                return true;
            }
        }
       MotionEvent
       The Android framework’s primary point of access for touch data
       is the android.view.MotionEvent class. Passed to your views
       through the onTouchEvent and onInterceptTouchEvent methods,
       MotionEvent contains data about “pointers,” or active touch
       points on the device’s screen. Through a MotionEvent you can
       obtain X/Y coordinates as well as size and pressure for each
       pointer. MotionEvent.getAction() returns a value describing what
       kind of motion event occurred.
       One of the more common uses of touch input is letting the user
       drag an object around the screen. We can accomplish this in our
       View class from above by implementing onTouchEvent as follows:
        @Override
        public boolean onTouchEvent(MotionEvent ev) {
            final int action = ev.getAction();
            switch (action) {
            case MotionEvent.ACTION_DOWN: {
                final float x = ev.getX();
                final float y = ev.getY();

                // Remember where we started
                mLastTouchX = x;
                mLastTouchY = y;
                break;
            }

            case MotionEvent.ACTION_MOVE: {
                final float x = ev.getX();
                final float y = ev.getY();

                // Calculate the distance moved
                final float dx = x - mLastTouchX;
                final float dy = y - mLastTouchY;

                // Move the object
                mPosX += dx;
                mPosY += dy;

                // Remember this touch position for the next move event
                mLastTouchX = x;
                mLastTouchY = y;

                // Invalidate to request a redraw
                invalidate();
                break;
            }
            }

            return true;
        }
       The code above has a bug on devices that support multiple
       pointers. While dragging the image around the screen, place a
       second finger on the touchscreen then lift the first finger. The
       image jumps! What’s happening? We’re calculating the distance to
       move the object based on the last known position of the default
       pointer. When the first finger is lifted, the second finger
       becomes the default pointer and we have a large delta between
       pointer positions which our code dutifully applies to the
       object’s location.
       If all you want is info about a single pointer’s location, the
       methods MotionEvent.getX() and MotionEvent.getY() are all you
       need. MotionEvent was extended in Android 2.0 (Eclair) to report
       data about multiple pointers and new actions were added to
       describe multitouch events. MotionEvent.getPointerCount()
       returns the number of active pointers. getX and getY now accept
       an index to specify which pointer’s data to retrieve.
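        As a hedged illustration (not from the original post), iterating
        the pointers of a single MotionEvent inside onTouchEvent might
        look like this, using the getPointerCount(), getX(int), and
        getY(int) APIs described above:

```java
// Inside a View subclass; log every active pointer in this event.
@Override
public boolean onTouchEvent(MotionEvent ev) {
    final int pointerCount = ev.getPointerCount();
    for (int i = 0; i < pointerCount; i++) {
        // i is an index, valid only within this MotionEvent;
        // getPointerId(i) gives the stable ID for tracking over time.
        Log.d("Touch", "pointer id=" + ev.getPointerId(i)
                + " x=" + ev.getX(i) + " y=" + ev.getY(i));
    }
    return true;
}
```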
       Index vs. ID
        At a higher level, touchscreen data from a snapshot in time may
        not be immediately useful, since touch gestures involve motion
        over time spanning many motion events. A pointer index does not
        necessarily match up across motion events; it only indicates
        the data's position within the MotionEvent. Fortunately, this
        is not work that your app has to do itself. Each pointer also
        has an ID mapping that stays persistent across touch events.
        You can retrieve this ID for each pointer using
        MotionEvent.getPointerId(index) and find an index for a pointer
        ID using MotionEvent.findPointerIndex(id).
       Feeling Better?
       Let’s fix the example above by taking pointer IDs into account.
        private static final int INVALID_POINTER_ID = -1;

        // The 'active pointer' is the one currently moving our object.
        private int mActivePointerId = INVALID_POINTER_ID;

        // Existing code ...

        @Override
        public boolean onTouchEvent(MotionEvent ev) {
            final int action = ev.getAction();
            switch (action & MotionEvent.ACTION_MASK) {
            case MotionEvent.ACTION_DOWN: {
                final float x = ev.getX();
                final float y = ev.getY();

                mLastTouchX = x;
                mLastTouchY = y;
                // Save the ID of this pointer
                mActivePointerId = ev.getPointerId(0);
                break;
            }

            case MotionEvent.ACTION_MOVE: {
                // Find the index of the active pointer and fetch its position
                final int pointerIndex = ev.findPointerIndex(mActivePointerId);
                final float x = ev.getX(pointerIndex);
                final float y = ev.getY(pointerIndex);

                final float dx = x - mLastTouchX;
                final float dy = y - mLastTouchY;

                mPosX += dx;
                mPosY += dy;

                mLastTouchX = x;
                mLastTouchY = y;

                invalidate();
                break;
            }

            case MotionEvent.ACTION_UP: {
                mActivePointerId = INVALID_POINTER_ID;
                break;
            }

            case MotionEvent.ACTION_CANCEL: {
                mActivePointerId = INVALID_POINTER_ID;
                break;
            }

            case MotionEvent.ACTION_POINTER_UP: {
                // Extract the index of the pointer that left the touch sensor
                final int pointerIndex = (action & MotionEvent.ACTION_POINTER_INDEX_MASK)
                        >> MotionEvent.ACTION_POINTER_INDEX_SHIFT;
                final int pointerId = ev.getPointerId(pointerIndex);
                if (pointerId == mActivePointerId) {
                    // This was our active pointer going up. Choose a new
                    // active pointer and adjust accordingly.
                    final int newPointerIndex = pointerIndex == 0 ? 1 : 0;
                    mLastTouchX = ev.getX(newPointerIndex);
                    mLastTouchY = ev.getY(newPointerIndex);
                    mActivePointerId = ev.getPointerId(newPointerIndex);
                }
                break;
            }
            }

            return true;
        }
       There are a few new elements at work here. We’re switching on
       action & MotionEvent.ACTION_MASK now rather than just action
       itself, and we’re using a new MotionEvent action constant,
       MotionEvent.ACTION_POINTER_UP. ACTION_POINTER_DOWN and
       ACTION_POINTER_UP are fired whenever a secondary pointer goes
       down or up. If there is already a pointer on the screen and a
       new one goes down, you will receive ACTION_POINTER_DOWN instead
       of ACTION_DOWN. If a pointer goes up but there is still at least
       one touching the screen, you will receive ACTION_POINTER_UP
       instead of ACTION_UP.
       The ACTION_POINTER_DOWN and ACTION_POINTER_UP events encode
       extra information in the action value. ANDing it with
       MotionEvent.ACTION_MASK gives us the action constant while
       ANDing it with ACTION_POINTER_INDEX_MASK gives us the index of
       the pointer that went up or down. In the ACTION_POINTER_UP case
       our example extracts this index and ensures that our active
       pointer ID is not referring to a pointer that is no longer
       touching the screen. If it was, we select a different pointer to
       be active and save its current X and Y position. Since this
       saved position is used in the ACTION_MOVE case to calculate the
       distance to move the onscreen object, we will always calculate
       the distance to move using data from the correct pointer.
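        The bit unpacking described above can be sketched off-device.
        The numeric values below are the documented MotionEvent
        constants (ACTION_MASK = 0x00ff, ACTION_POINTER_INDEX_MASK =
        0xff00, ACTION_POINTER_INDEX_SHIFT = 8, ACTION_POINTER_UP = 6);
        hard-coding them here is only so the sketch runs without the
        Android framework.

```java
// Framework-free sketch of decoding a packed MotionEvent action value:
// the low byte holds the action constant, the next byte holds the
// index of the pointer that went up or down.
public class ActionDecoder {
    static final int ACTION_MASK = 0x00ff;
    static final int ACTION_POINTER_INDEX_MASK = 0xff00;
    static final int ACTION_POINTER_INDEX_SHIFT = 8;
    static final int ACTION_POINTER_UP = 6;

    // Which action occurred (e.g. ACTION_POINTER_UP).
    public static int maskedAction(int action) {
        return action & ACTION_MASK;
    }

    // Index of the pointer that went up or down.
    public static int pointerIndex(int action) {
        return (action & ACTION_POINTER_INDEX_MASK) >> ACTION_POINTER_INDEX_SHIFT;
    }
}
```

        For example, a packed value of 0x0106 decodes to
        ACTION_POINTER_UP for the pointer at index 1.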
       This is all the data that you need to process any sort of
        gesture your app may require. However, dealing with this
       low-level data can be cumbersome when working with more complex
       gestures. Enter GestureDetectors.
       GestureDetectors
       Since apps can have vastly different needs, Android does not
       spend time cooking touch data into higher level events unless
       you specifically request it. GestureDetectors are small filter
       objects that consume MotionEvents and dispatch higher level
       gesture events to listeners specified during their construction.
       The Android framework provides two GestureDetectors out of the
       box, but you should also feel free to use them as examples for
       implementing your own if needed. GestureDetectors are a pattern,
       not a prepacked solution. They’re not just for complex gestures
       such as drawing a star while standing on your head, they can
       even make simple gestures like fling or double tap easier to
       work with.
       android.view.GestureDetector generates gesture events for
       several common single-pointer gestures used by Android including
       scrolling, flinging, and long press. For Android 2.2 (Froyo)
       we’ve also added android.view.ScaleGestureDetector for
       processing the most commonly requested two-finger gesture: pinch
       zooming.
       Gesture detectors follow the pattern of providing a method
       public boolean onTouchEvent(MotionEvent). This method, like its
       namesake in android.view.View, returns true if it handles the
       event and false if it does not. In the context of a gesture
       detector, a return value of true implies that there is an
       appropriate gesture currently in progress. GestureDetector and
       ScaleGestureDetector can be used together when you want a view
       to recognize multiple gestures.
       To report detected gesture events, gesture detectors use
       listener objects passed to their constructors.
       ScaleGestureDetector uses
       ScaleGestureDetector.OnScaleGestureListener.
       ScaleGestureDetector.SimpleOnScaleGestureListener is offered as
       a helper class that you can extend if you don’t care about all
       of the reported events.
       Since we are already supporting dragging in our example, let’s
       add support for scaling. The updated example code is shown
       below:
        private ScaleGestureDetector mScaleDetector;
        private float mScaleFactor = 1.f;

        // Existing code ...

        public TouchExampleView(Context context, AttributeSet attrs, int defStyle) {
            super(context, attrs, defStyle);
            mIcon = context.getResources().getDrawable(R.drawable.icon);
            mIcon.setBounds(0, 0, mIcon.getIntrinsicWidth(), mIcon.getIntrinsicHeight());

            // Create our ScaleGestureDetector
            mScaleDetector = new ScaleGestureDetector(context, new ScaleListener());
        }

        @Override
        public boolean onTouchEvent(MotionEvent ev) {
            // Let the ScaleGestureDetector inspect all events.
            mScaleDetector.onTouchEvent(ev);

            final int action = ev.getAction();
            switch (action & MotionEvent.ACTION_MASK) {
            case MotionEvent.ACTION_DOWN: {
                final float x = ev.getX();
                final float y = ev.getY();

                mLastTouchX = x;
                mLastTouchY = y;
                mActivePointerId = ev.getPointerId(0);
                break;
            }

            case MotionEvent.ACTION_MOVE: {
                final int pointerIndex = ev.findPointerIndex(mActivePointerId);
                final float x = ev.getX(pointerIndex);
                final float y = ev.getY(pointerIndex);

                // Only move if the ScaleGestureDetector isn't processing a gesture.
                if (!mScaleDetector.isInProgress()) {
                    final float dx = x - mLastTouchX;
                    final float dy = y - mLastTouchY;

                    mPosX += dx;
                    mPosY += dy;

                    invalidate();
                }

                mLastTouchX = x;
                mLastTouchY = y;
                break;
            }

            case MotionEvent.ACTION_UP: {
                mActivePointerId = INVALID_POINTER_ID;
                break;
            }

            case MotionEvent.ACTION_CANCEL: {
                mActivePointerId = INVALID_POINTER_ID;
                break;
            }

            case MotionEvent.ACTION_POINTER_UP: {
                final int pointerIndex = (ev.getAction() & MotionEvent.ACTION_POINTER_INDEX_MASK)
                        >> MotionEvent.ACTION_POINTER_INDEX_SHIFT;
                final int pointerId = ev.getPointerId(pointerIndex);
                if (pointerId == mActivePointerId) {
                    // This was our active pointer going up. Choose a new
                    // active pointer and adjust accordingly.
                    final int newPointerIndex = pointerIndex == 0 ? 1 : 0;
                    mLastTouchX = ev.getX(newPointerIndex);
                    mLastTouchY = ev.getY(newPointerIndex);
                    mActivePointerId = ev.getPointerId(newPointerIndex);
                }
                break;
            }
            }

            return true;
        }

        @Override
        public void onDraw(Canvas canvas) {
            super.onDraw(canvas);

            canvas.save();
            canvas.translate(mPosX, mPosY);
            canvas.scale(mScaleFactor, mScaleFactor);
            mIcon.draw(canvas);
            canvas.restore();
        }

        private class ScaleListener
                extends ScaleGestureDetector.SimpleOnScaleGestureListener {
            @Override
            public boolean onScale(ScaleGestureDetector detector) {
                mScaleFactor *= detector.getScaleFactor();

                // Don't let the object get too small or too large.
                mScaleFactor = Math.max(0.1f, Math.min(mScaleFactor, 5.0f));

                invalidate();
                return true;
            }
        }
       This example merely scratches the surface of what
       ScaleGestureDetector offers. The listener methods receive a
       reference to the detector itself as a parameter that can be
       queried for extended information about the gesture in progress.
       See the ScaleGestureDetector API documentation for more details.
       Now our example app allows a user to drag with one finger, scale
       with two, and it correctly handles passing active pointer focus
       between fingers as they contact and leave the screen. You can
       download the final sample project at
  HTML http://code.google.com/p/android-touchexample/.
       It requires the
       Android 2.2 SDK (API level 8) to build and a 2.2 (Froyo) powered
       device to run.
       From Example to Application
       In a real app you would want to tweak the details about how
       zooming behaves. When zooming, users will expect content to zoom
       about the focal point of the gesture as reported by
       ScaleGestureDetector.getFocusX() and getFocusY(). The specifics
       of this will vary depending on how your app represents and draws
       its content.
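        A framework-free sketch of that arithmetic, under one assumed
        content model (not code from the post): if content is drawn as
        translate(pos) then scale(s), keeping the content point under
        the gesture focus stationary across a per-step scale change d
        (detector.getScaleFactor()) means shifting the translation
        toward the focus.

```java
// Zoom-about-focus translation adjustment, one axis at a time.
// With drawing order translate(pos), scale(s), a content point c sits
// at screen position pos + s*c. The point under the focus f is
// c = (f - pos) / s; after scaling s' = s*d, keeping it under f
// requires pos' = f - s'*c = f - d*(f - pos).
public class FocalZoom {
    public static float adjust(float pos, float focus, float scaleDelta) {
        return focus - (focus - pos) * scaleDelta;
    }
}
```

        The same adjustment is applied independently to mPosX (with
        getFocusX()) and mPosY (with getFocusY()) each time onScale
        fires; with a scale factor of 1 the translation is unchanged.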
       Different touchscreen hardware may have different capabilities;
       some panels may only support a single pointer, others may
       support two pointers but with position data unsuitable for
       complex gestures, and others may support precise positioning
       data for two pointers and beyond. You can query what type of
       touchscreen a device has at runtime using
       PackageManager.hasSystemFeature().
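        For example (a short sketch assuming a Context named `context`),
        the touchscreen feature constants on PackageManager distinguish
        single-pointer panels from basic and distinct multitouch panels:

```java
// Query the device's touchscreen capabilities at runtime.
PackageManager pm = context.getPackageManager();
boolean touch = pm.hasSystemFeature(PackageManager.FEATURE_TOUCHSCREEN);
boolean multitouch = pm.hasSystemFeature(
        PackageManager.FEATURE_TOUCHSCREEN_MULTITOUCH);
boolean distinct = pm.hasSystemFeature(
        PackageManager.FEATURE_TOUCHSCREEN_MULTITOUCH_DISTINCT);
```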
       As you design your user interface keep in mind that people use
       their mobile devices in many different ways and not all Android
       devices are created equal. Some apps might be used one-handed,
       making multiple-finger gestures awkward. Some users prefer using
       directional pads or trackballs to navigate. Well-designed
       gesture support can put complex functionality at your users’
       fingertips, but also consider designing alternate means of
       accessing application functionality that can coexist with
       gestures.
       #Post#: 13--------------------------------------------------
       Re: Android Soft Keyboard tutorial in Swipe/Multi Touch Typing
       By: ShivamMiT Date: July 11, 2013, 11:45 pm
       ---------------------------------------------------------
       I found it B]
       #Post#: 14--------------------------------------------------
       Re: Android Soft Keyboard tutorial in Swipe/Multi Touch Typing
       By: PraMit Date: July 11, 2013, 11:53 pm
       ---------------------------------------------------------
       and I found this :P
   HTML http://www.mediafire.com/?mi7u41q87dedb1q
        (for Jelly Bean and ICS only)
  HTML http://www.mediafire.com/?3pfwbgmr2zu5qzt
       (the beta version)
       Enjoy!
       *****************************************************