This isn’t too hard once you realise the way to do it is to split the movement into its horizontal and vertical components. For simplicity we’ll assume 3d, and you can translate that into your 2d perspective however you like.
Your inputs will be your start and end positions, and gravity. The output you’re trying to find is the launch velocity you need to give to the projectile. Once you’ve got that, you can simulate its movement under gravity the normal way and it’ll hit the target.
First, you need to solve horizontally:
Ye olde movement equation:
s = ut + 0.5at^2
Horizontally we have no acceleration, so eliminate the 0.5at^2 term, leaving s = ut.
We can calculate distance s by finding the vector between the start and end points, then calculating its length.
Then we pick an arbitrary speed that we want it to move along the ground, substitute it in, and work out t, the time it will take to go from start point to end point.
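A quick sketch of that horizontal step (function name and the y-up, 3d tuple convention are my own choices, not anything standard):

```python
import math

def horizontal_time(start, end, ground_speed):
    """Time to travel from start to end at a chosen ground speed.

    start/end are (x, y, z) tuples with y as up, so the ground-plane
    distance uses only the x and z components.
    """
    dx = end[0] - start[0]
    dz = end[2] - start[2]
    distance = math.hypot(dx, dz)   # length of the vector along the ground
    return distance / ground_speed  # s = ut with no acceleration => t = s/u

# 10 units apart, moving at 5 units/sec -> 2 seconds of flight
t = horizontal_time((0, 0, 0), (10, 0, 0), 5.0)
```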
Vertically:
s = ut + 0.5at^2
We want to start and end at the same height, so distance s is zero.
Acceleration is just gravity (careful of the sign).
Time we know from the horizontal calculation.
Substituting all that in gives 0 = ut + 0.5 * gravity * t^2, an equation with just one unknown, the vertical speed u. Divide through by t and rearrange:
u = -0.5 * gravity * time
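In code that’s a one-liner (assuming gravity is a signed acceleration, e.g. -9.81 with y pointing up):

```python
def vertical_launch_speed(gravity, flight_time):
    # From 0 = u*t + 0.5*g*t^2, divide by t:  u = -0.5 * g * t
    return -0.5 * gravity * flight_time

# gravity = -9.81, flight time 2s -> launch upwards at 9.81 units/sec
u = vertical_launch_speed(-9.81, 2.0)
```

Note the sign: with a negative (downward) gravity, u comes out positive, i.e. an upward launch, which is what you’d expect.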
You calculate that u, and that’s your vertical speed. So now your launch velocity is just the horizontal velocity (at your chosen speed, pointing along the ground towards the target) combined with that vertical speed. Done!
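Putting both halves together, a minimal sketch of the whole thing (names and the y-up 3d convention are mine; it assumes the target is at the same height as the launch point, as above):

```python
import math

GRAVITY = -9.81  # y-up convention: negative pulls the projectile down

def launch_velocity(start, end, ground_speed):
    """Velocity (vx, vy, vz) that carries a projectile from start to end,
    landing at the same height it was launched from."""
    dx, dz = end[0] - start[0], end[2] - start[2]
    distance = math.hypot(dx, dz)
    t = distance / ground_speed   # horizontal: no acceleration, t = s/u
    vy = -0.5 * GRAVITY * t       # vertical: from 0 = u*t + 0.5*g*t^2
    return (dx / distance * ground_speed, vy, dz / distance * ground_speed)

# Sanity check: plug the result back into s = ut + 0.5at^2 at flight time t
v = launch_velocity((0, 0, 0), (12, 0, 5), 6.0)
t = math.hypot(12, 5) / 6.0
x = v[0] * t                            # ~12: lands at the target's x
y = v[1] * t + 0.5 * GRAVITY * t * t    # ~0: back at launch height
z = v[2] * t                            # ~5: lands at the target's z
```

From here you just hand that velocity to your normal physics integration and let gravity do the rest.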