Touchpad inactive after gesture configuration

I wanted to enable additional gestures on my Dell Vostro laptop, but I only made things worse: my touchpad no longer works at all.
It shows up as a PS/2 Generic Mouse, and I don’t know whether it should use the Synaptics or the libinput driver.
I tried uninstalling one and then the other; at one point I even lost my keyboard, which I have since recovered, phew! But the touchpad is still not detected.
I’m on Ubuntu 20.04.

catalina – Catalyst apps don’t seem to support three-finger swipe gestures

I finally updated to Catalina, and it doesn’t seem that any Catalyst apps I’ve tested support three-finger swipe gestures – they seem to expect them to be done with two fingers regardless of this setting:

(screenshot: the System Preferences trackpad swipe setting)

It’s odd because every other Mac app seems to respect this setting, so two-finger swipes scroll horizontally and only three-finger gestures go back/forward. Alas, Catalyst apps seem to ignore it completely. Does anyone know whether it’s possible for developers to support three-finger gestures in their Catalyst apps, or is this just broken at the framework level?

gestures – Why don’t all android apps use fullscreen and fitSystemWindows?

Android 10 ships with iOS-like gesture navigation, but the gesture pill sits on what looks like the traditional navigation bar: the whole width at the bottom, just with less height, leaving a wasted black bar at the bottom of nearly every app. On iOS, by contrast, all apps draw their background under the pill and keep the UI controls above it. After some research, I figured out that to achieve the same behaviour as iOS, Android developers should enable fullscreen and apply fitSystemWindows to the UI controls. This would make apps look much better, without the wasted area at the bottom. So why do so few apps do this? Not even games that are supposed to be fullscreen, like Pokémon GO.

SwiftUI: multi-touch gesture / multiple gestures

Is there a way in SwiftUI to track multiple gestures at once? I want my single main view to be able to track multiple fingers by dragging at once.

ZStack {
    Color.black
        .edgesIgnoringSafeArea(.all)
        .gesture(
            DragGesture(minimumDistance: 0)
                .onChanged { value in
                    // some logic
                }
                .onEnded { value in
                    // more logic
                }
        )
    // other code
}

I have this code, however I can only process one drag gesture at a time. If you drag a finger and then try to add another, the first one stops.

I am trying to achieve an effect where there are multiple fingers on the screen at the same time. Each finger drags a circle simultaneously (one circle follows each finger).

I see simultaneous gestures in the Apple documentation, but that refers to one touch sequence driving multiple gesture handlers, not to tracking several fingers independently.
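SwiftUI's `DragGesture` only ever reports one touch, so a common workaround is to drop down to UIKit through `UIViewRepresentable`: a `UIView` with `isMultipleTouchEnabled = true` receives every finger in `touchesBegan`/`Moved`/`Ended`, and you keep one point per `UITouch` keyed by its identity. This is a sketch only; `TouchTracker`, `MultiTouchUIView`, and the binding names are made up for illustration:

```swift
import Foundation

// Pure bookkeeping: one point per active touch, keyed by object identity.
// Generic over the touch type so the logic is independent of UIKit.
final class TouchTracker<Touch: AnyObject> {
    private(set) var points: [ObjectIdentifier: CGPoint] = [:]

    func update(_ touch: Touch, to point: CGPoint) {
        points[ObjectIdentifier(touch)] = point
    }

    func remove(_ touch: Touch) {
        points.removeValue(forKey: ObjectIdentifier(touch))
    }
}

#if canImport(UIKit)
import SwiftUI
import UIKit

// A UIView that reports every active finger, not just the first one.
final class MultiTouchUIView: UIView {
    private let tracker = TouchTracker<UITouch>()
    var onChange: (([CGPoint]) -> Void)?

    override init(frame: CGRect) {
        super.init(frame: frame)
        isMultipleTouchEnabled = true   // the part DragGesture cannot give you
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) { track(touches) }
    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) { track(touches) }
    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) { untrack(touches) }
    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) { untrack(touches) }

    private func track(_ touches: Set<UITouch>) {
        for touch in touches { tracker.update(touch, to: touch.location(in: self)) }
        onChange?(Array(tracker.points.values))
    }

    private func untrack(_ touches: Set<UITouch>) {
        for touch in touches { tracker.remove(touch) }
        onChange?(Array(tracker.points.values))
    }
}

// Bridge into SwiftUI: bind the current finger positions out to the view.
struct MultiTouchView: UIViewRepresentable {
    @Binding var fingers: [CGPoint]

    func makeUIView(context: Context) -> MultiTouchUIView {
        let view = MultiTouchUIView()
        view.onChange = { fingers = $0 }
        return view
    }

    func updateUIView(_ uiView: MultiTouchUIView, context: Context) {}
}
#endif
```

You could then overlay `MultiTouchView` in the `ZStack` and draw a circle at each point in `fingers`, which gives the "one circle per finger" effect described above.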

Search – How to disable touchpad gestures in Windows?

I have found a way: go to Control Panel -> Hardware and Sound -> Mouse, open the touchpad settings in the center-right of the window, select Multi-Finger -> Three Fingers, uncheck "Enable touch" and, if you wish, also clear the "Enable gestures" checkbox. This fixed the search pop-up window on my Windows laptop.

(screenshot of the touchpad settings with the relevant options highlighted)

lenovo – Kubuntu 18.04 – Touchpad recognized as mouse – Multitouch gestures do not work

I have a problem with my laptop, as the title says:
The trackpad is recognized as a mouse, so I don't have multitouch gestures, other than scrolling.

I cannot switch applications with gestures, and I cannot change touchpad-specific settings because, according to the System Settings panel, I have a mouse and not a touchpad.

I had this same problem in Linux Mint and Ubuntu Mate 18.04, so I think it is not a specific distribution problem.

I have tried following other guides for getting multitouch gestures to work, but most target earlier kernel versions and focus on making the touchpad work at all, AFAIK.

  • My laptop is a Lenovo V330-15IKB.
  • My kernel version is 5.3.0-26-generic.
  • Running dmesg | grep elan gives no results; dmesg | grep i2c gives:
dmesg | grep i2c         
[    2.224353] i2c /dev entries driver
[    4.969912] i2c_hid i2c-AUI1657:00: i2c-AUI1657:00 supply vdd not found, using dummy regulator
[    4.969924] i2c_hid i2c-AUI1657:00: i2c-AUI1657:00 supply vddl not found, using dummy regulator
[    5.218914] input: AUI1657:00 044E:121E Mouse as /devices/pci0000:00/0000:00:15.0/i2c_designware.0/i2c-5/i2c-AUI1657:00/0018:044E:121E.0001/input/input7
[    5.219023] input: AUI1657:00 044E:121E Keyboard as /devices/pci0000:00/0000:00:15.0/i2c_designware.0/i2c-5/i2c-AUI1657:00/0018:044E:121E.0001/input/input8
[    5.219075] input: AUI1657:00 044E:121E as /devices/pci0000:00/0000:00:15.0/i2c_designware.0/i2c-5/i2c-AUI1657:00/0018:044E:121E.0001/input/input9
[    5.219139] hid-generic 0018:044E:121E.0001: input,hidraw0: I2C HID v1.00 Mouse [AUI1657:00 044E:121E] on i2c-AUI1657:00

  sudo acpidump | grep -C3 ELAN
  E730: 4E 41 32 42 34 31 00 5F 48 49 44 A1 10 70 0D 53  NA2B41._HID..p.S
  E740: 59 4E 41 32 42 34 32 00 5F 48 49 44 A1 4E 07 A0  YNA2B42._HID.N..
  E750: 34 93 61 0A 02 A0 1D 91 93 60 0A 04 91 93 60 0A  4.a......`....`.
  E760: 02 93 60 00 70 0D 45 4C 41 4E 30 36 31 37 00 5F  ..`.p.ELAN0617._
  E770: 48 49 44 A1 10 70 0D 45 4C 41 4E 30 36 31 38 00  HID..p.ELAN0618.
  E780: 5F 48 49 44 A1 46 04 A0 32 93 61 0A 04 A0 1C 91  _HID.F..2.a.....
  E790: 93 60 0A 04 91 93 60 0A 02 93 60 00 70 0D 41 55  .`....`...`.p.AU
  E7A0: 49 31 36 35 36 00 5F 48 49 44 A1 0F 70 0D 41 55  I1656._HID..p.AU

I come from Windows, where I had all of this working really well. Could someone lend me a hand getting it to work here? Thank you!

code review: UITableView that allows deleting some types of requests with swipe gestures

I have a table view that shows a list of objects called Requests. It has three segments, namely Accepted, Received, and Sent, and the objects for each segment are kept in three separate arrays.


I want to enable row deletion (and, in turn, deletion of the underlying objects) only for the Accepted and Sent segments.


This is my current implementation.

func tableView(_ tableView: UITableView, trailingSwipeActionsConfigurationForRowAt indexPath: IndexPath) -> UISwipeActionsConfiguration? {

    var deleteAction: UIContextualAction?

    switch currentShowingStatus {
    case .accepted:
        deleteAction = UIContextualAction(style: .destructive, title: "Delete") { action, view, completionHandler in
            let request = self.acceptedRequests[indexPath.row]
            self.deleteClientRequest(request)
            completionHandler(true)
        }
    case .sent:
        deleteAction = UIContextualAction(style: .destructive, title: "Delete") { action, view, completionHandler in
            let request = self.sentRequests[indexPath.row]
            self.deleteClientRequest(request)
            completionHandler(true)
        }
    default:
        break
    }

    if let deleteAction = deleteAction {
        let configuration = UISwipeActionsConfiguration(actions: [deleteAction])
        configuration.performsFirstActionWithFullSwipe = false
        return configuration
    } else {
        return UISwipeActionsConfiguration(actions: [])
    }
}

As you can see, there is quite a bit of duplicated code, and I can't extract the UIContextualAction creation into a shared method because of its completion handler: selecting the object to be deleted happens there.

I also can't define the UISwipeActionsConfiguration as a local variable to reduce the duplication, because the class must be initialized with the UIContextualAction instances.

Note: you must pass an empty actions array to UISwipeActionsConfiguration to keep the swipe actions from showing on the cell. Returning nil does not do that.

All of this has produced ugly code, and I wonder if there is a better way to refactor it.


Alternative approach

Instead of first checking the segment and then adding the corresponding deletion action, I moved the segment check into the completion handler, like this.

func tableView(_ tableView: UITableView, trailingSwipeActionsConfigurationForRowAt indexPath: IndexPath) -> UISwipeActionsConfiguration? {
    var actions = [UIContextualAction]()

    let deleteAction = UIContextualAction(style: .destructive, title: "Delete") { action, view, completionHandler in
        switch self.currentShowingStatus {
        case .accepted:
            let request = self.acceptedRequests[indexPath.row]
            self.deleteClientRequest(request)
        case .sent:
            let request = self.sentRequests[indexPath.row]
            self.deleteClientRequest(request)
        default:
            break
        }
        completionHandler(true)
    }

    switch currentShowingStatus {
    case .accepted, .sent:
        actions.append(deleteAction)
    case .received:
        actions.removeAll()
    }

    let configuration = UISwipeActionsConfiguration(actions: actions)
    configuration.performsFirstActionWithFullSwipe = false
    return configuration
}

The disadvantage is that I still have to check the segment a second time so that the swipe action is only shown for the Accepted and Sent segments.
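One way to cut the duplication and check the segment exactly once is to give the segment enum a single "is this deletable?" property and fetch the row's object through one helper. This is a sketch under the question's assumptions; `ShowingStatus`, `request(at:)`, and the `String` stand-in for the Request model are names invented here, not taken from the post:

```swift
import Foundation

// Hypothetical mirror of the question's currentShowingStatus segments.
enum ShowingStatus {
    case accepted, received, sent

    // Single source of truth for which segments allow row deletion.
    var allowsDeletion: Bool {
        switch self {
        case .accepted, .sent: return true
        case .received: return false
        }
    }
}

#if canImport(UIKit)
import UIKit

final class RequestsViewController: UIViewController, UITableViewDelegate {
    var currentShowingStatus: ShowingStatus = .accepted
    var acceptedRequests: [String] = []   // String stands in for the Request model
    var sentRequests: [String] = []

    // One helper that resolves the row's object for the current segment.
    func request(at row: Int) -> String {
        currentShowingStatus == .accepted ? acceptedRequests[row] : sentRequests[row]
    }

    func deleteClientRequest(_ request: String) {
        // remove the object from its backing array / backend here
    }

    func tableView(_ tableView: UITableView,
                   trailingSwipeActionsConfigurationForRowAt indexPath: IndexPath)
        -> UISwipeActionsConfiguration? {
        // An empty configuration hides the swipe actions; returning nil would not.
        guard currentShowingStatus.allowsDeletion else {
            return UISwipeActionsConfiguration(actions: [])
        }
        let delete = UIContextualAction(style: .destructive, title: "Delete") { _, _, done in
            self.deleteClientRequest(self.request(at: indexPath.row))
            done(true)
        }
        let configuration = UISwipeActionsConfiguration(actions: [delete])
        configuration.performsFirstActionWithFullSwipe = false
        return configuration
    }
}
#endif
```

With this shape, adding a new deletable segment means touching only `allowsDeletion` and `request(at:)`, and the delegate method itself is written once.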

Pie 9.0: gestures in Android 9 and what they mean for Android application backward compatibility

I heard about the gesture feature in Android 9, which somehow replaces ordinary navigation.

Is this feature generally optional on smartphones with Android 9? (I.e., is it disabled by default? Or is it enabled by default but can be disabled?)

Will applications built for Android 8 and lower simply not allow normal navigation on Android 9 phones with gestures enabled? (Assuming gestures can generally be disabled.)

2d – How can I use pan gestures to zoom a camera in and out?

You can zoom a camera in and out by handling the InputEventPanGesture input event:

extends Camera2D

var maximum_zoom_in = 0.15
var minimum_zoom_out = 4
var zoom_sensitivity = 0.01

func _unhandled_input(event):
    if event is InputEventPanGesture:
        var zoom_amount = event.delta.y * zoom_sensitivity
        var new_zoom = zoom.y + zoom_amount
        if new_zoom < maximum_zoom_in:
            new_zoom = maximum_zoom_in
        elif new_zoom > minimum_zoom_out:
            new_zoom = minimum_zoom_out
        zoom = Vector2(new_zoom, new_zoom)

6.0 marshmallow – How to imitate gestures in developer mode?

When I turn on USB debugging, I get the following warning: "Developers can use special software to mimic gestures, change settings and grant permissions through USB."

Can someone tell me exactly how I can mimic gestures from my PC using such special software and, if possible, how to trigger a touch input on part of the Android device's screen when I press a volume button or the fingerprint scanner?