I want to know how to perform a drag on Android based on X, Y mouse coordinates. Consider, as two simple examples, TeamViewer/QuickSupport drawing the "password pattern" on r
Have you tried AutoIt scripting?
You can save coordinates within specific windows/screens, and you can hold the mouse button down while drawing the pattern.
I also have some example code/scripts for you if you want them!
EDIT:
According to this tutorial, you can use AutoIt from C#.
Follow these steps:
using AutoItX3Lib;
AutoItX3 auto = new AutoItX3();
This is a complete example for executing a mouse click:
using AutoItX3Lib;
AutoItX3 auto = new AutoItX3();
auto.MouseClick("left", 78, 1133, 1, 35);
With the AutoIt Window Info Tool
you can check the coordinates you want to use.
Please note there are differences between mouse coordinate modes:
for example: auto.AutoItSetOption("MouseCoordMode", 1)
will use absolute screen coordinates. See source here.
For holding the mouse click down, you can check the MouseDown function; a minimal drag sketch follows below.
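Putting those pieces together, a drag is just a press, a series of moves, and a release. Here is a minimal sketch of drawing a pattern with AutoItX from C#; the coordinates, speeds, and the console-app wrapper are placeholders, and MouseCoordMode 1 (absolute screen coordinates) is assumed:

using AutoItX3Lib;

class AutoItDragSketch
{
    static void Main()
    {
        AutoItX3 auto = new AutoItX3();
        auto.AutoItSetOption("MouseCoordMode", 1); // absolute screen coordinates

        auto.MouseMove(100, 300, 0);   // jump to the start of the pattern
        auto.MouseDown("left");        // press and hold the left button
        auto.MouseMove(200, 300, 20);  // drag through intermediate points (speed 20)
        auto.MouseMove(200, 400, 20);
        auto.MouseMove(300, 400, 20);
        auto.MouseUp("left");          // release to finish the pattern
    }
}

AutoItX also has a MouseClickDrag function for a simple straight-line drag, but the MouseDown/MouseMove/MouseUp combination lets you pass through the intermediate points a pattern needs.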
Here is an example of a solution based on Edit 3 of the question.
C# Windows Forms application "formMain.cs":
using System;
using System.Collections.Generic;
using System.Drawing;
using System.Net.Sockets;
using System.Text;
using System.Windows.Forms;

private List<Point> lstPoints;

private void pictureBox1_MouseDown(object sender, MouseEventArgs e)
{
    if (e.Button == MouseButtons.Left)
    {
        // Start a new stroke on left-button press.
        lstPoints = new List<Point>();
        lstPoints.Add(new Point(e.X, e.Y));
    }
}

private void pictureBox1_MouseMove(object sender, MouseEventArgs e)
{
    if (e.Button == MouseButtons.Left)
    {
        lstPoints.Add(new Point(e.X, e.Y));
    }
}

private void pictureBox1_MouseUp(object sender, MouseEventArgs e)
{
    lstPoints.Add(new Point(e.X, e.Y));

    // Convert.ToString(Point) yields "{X=..,Y=..}", so the message becomes
    // "MDRAWEVENT{X=78,Y=133}:{X=80,Y=140}:..." followed by a newline.
    StringBuilder sb = new StringBuilder();
    foreach (Point obj in lstPoints)
    {
        sb.Append(Convert.ToString(obj) + ":");
    }

    // Socket.Send() takes bytes, not a string, so encode the line as UTF-8.
    serverSocket.Send(Encoding.UTF8.GetBytes("MDRAWEVENT" + sb + Environment.NewLine));
}
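Note that serverSocket is not defined in the snippet above. Judging by the names (serverSocket on the PC, clientSocket on the phone), the Windows Forms app appears to act as the listener and the Android service connects to it; a minimal sketch under that assumption, with a placeholder port, could be:

using System.Net;
using System.Net.Sockets;

// Hypothetical wiring for the serverSocket used above: listen on TCP port 5000
// (placeholder) and accept the connection coming from the Android service.
TcpListener listener = new TcpListener(IPAddress.Any, 5000);
listener.Start();
Socket serverSocket = listener.AcceptSocket(); // blocks until the phone connects

// Each drag then goes out as a single line, e.g.
// "MDRAWEVENT{X=78,Y=133}:{X=80,Y=140}:..." + Environment.NewLine,
// which is what the readLine()/split parsing in SocketBackground.java expects.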
Android service "SocketBackground.java":
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.Socket;
import java.nio.charset.StandardCharsets;
import java.util.regex.Pattern;
import android.graphics.Point;

String xline;
// Create the reader once; recreating it on every iteration could drop buffered data.
BufferedReader xreader = new BufferedReader(
        new InputStreamReader(clientSocket.getInputStream(), StandardCharsets.UTF_8));

while (clientSocket.isConnected()) {
    if (xreader.ready()) {
        while ((xline = xreader.readLine()) != null) {
            xline = xline.trim();
            if (!xline.isEmpty() && xline.contains("MDRAWEVENT")) {
                // Payload looks like "{X=78,Y=133}:{X=80,Y=140}:..." (see the C# sender).
                String payload = xline.replace("MDRAWEVENT", "");
                String[] tokens = payload.split(Pattern.quote(":"));
                Point[] moviments = new Point[tokens.length];
                for (int i = 0; i < tokens.length; i++) {
                    String[] coordinates = tokens[i].replace("{", "").replace("}", "").split(",");
                    int x = Integer.parseInt(coordinates[0].split("=")[1]);
                    int y = Integer.parseInt(coordinates[1].split("=")[1]);
                    moviments[i] = new Point(x, y);
                }
                MyAccessibilityService.instance.mouseDraw(moviments, 2000);
            }
        }
    }
}
Android AccessibilityService "MyAccessibilityService.java":
import android.accessibilityservice.AccessibilityService;
import android.accessibilityservice.GestureDescription;
import android.graphics.Path;
import android.graphics.Point;
import android.os.Build;

public void mouseDraw(Point[] segments, int time) {
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.N) {
        // Build one path through all points, then dispatch it as a single stroke;
        // dispatching inside the loop would start a new gesture per point and
        // cancel the previous one.
        Path path = new Path();
        path.moveTo(segments[0].x, segments[0].y);
        for (int i = 1; i < segments.length; i++) {
            path.lineTo(segments[i].x, segments[i].y);
        }
        GestureDescription.StrokeDescription sd =
                new GestureDescription.StrokeDescription(path, 0, time);
        dispatchGesture(new GestureDescription.Builder().addStroke(sd).build(),
                new AccessibilityService.GestureResultCallback() {
                    @Override
                    public void onCompleted(GestureDescription gestureDescription) {
                        super.onCompleted(gestureDescription);
                    }

                    @Override
                    public void onCancelled(GestureDescription gestureDescription) {
                        super.onCancelled(gestureDescription);
                    }
                }, null);
    }
}