Question
I am trying to build a screen-capture service. I have a physical PC, and I would like to run a program that displays a document (e.g. MS Word) and then take a screenshot. The problem is that if I use
using (var bitmap = new Bitmap(p.EndX - p.StartX, p.EndY - p.StartY))
{
    using (Graphics g = Graphics.FromImage(bitmap))
    {
        // Copy the on-screen region from (StartX, StartY) to (EndX, EndY)
        // into the top-left corner of the bitmap.
        g.CopyFromScreen(p.StartX, p.StartY, 0, 0,
            new Size(p.EndX - p.StartX, p.EndY - p.StartY));
    }
    bitmap.Save("result.png", ImageFormat.Png);
}
it works only if I have an LCD screen connected (or if I am connected via RDP). But whenever I unplug the screen or disconnect from RDP, the underlying Win32 call throws an error about an invalid handle, because the screen no longer exists.
Is there a way to force Windows into thinking that some (virtual/fake) monitor is connected, so that the screenshot can still be taken?
Answer 1:
I had the same instincts as Robert Harvey when I first read your question. I remember trying to set up machines as headless servers and running into all kinds of trouble. I could get everything working on the bench with a monitor connected, but as soon as I disconnected the monitor, the video hardware seemed to stop working altogether, so that I couldn't even connect remotely to the system. I assume it has something to do with the VGA sync signal: when the video hardware isn't getting that sync signal, it shuts itself off. It is trivially bypassed by shorting the sync pins. If you're electrically inclined, you can look up a pinout spec for the VGA connector and figure out what to do. Otherwise, you can often get away with buying and hooking up a VGA-to-DVI adapter that tricks the hardware into thinking that a monitor is connected.
But it doesn't sound like that's your problem, since you said that you can still connect to the machine using Remote Desktop, even without a screen connected. I suspect that the question of whether or not a monitor is connected is a red herring. My guess is that if you log in to the machine and then disconnect the monitor, everything will continue working fine. Likewise, if you connect a monitor but don't log in, nothing will work either.
So that means your problem is just a logical one: when there's no screen, there's nothing displayed on that screen to take a picture of, so the function fails. The code that powers the CopyFromScreen function is getting a handle to the virtual screen containing the desktop, and then blitting its image to a bitmap. If you aren't logged in, either locally or remotely, there is no desktop and no interactive applications running on it, and therefore CopyFromScreen fails.
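For illustration, here is a rough hand-rolled sketch of that same path done with P/Invoke. Everything in it (the ScreenBlit class name, the 800x600 placeholder size, the output file name) is my own invention rather than anything from the question; the point is just to show where the screen device context comes from and why the operation has nothing to copy without an interactive desktop.

using System;
using System.Drawing;
using System.Drawing.Imaging;
using System.Runtime.InteropServices;

static class ScreenBlit
{
    [DllImport("user32.dll")]
    static extern IntPtr GetDC(IntPtr hWnd);

    [DllImport("user32.dll")]
    static extern int ReleaseDC(IntPtr hWnd, IntPtr hDC);

    [DllImport("gdi32.dll")]
    static extern bool BitBlt(IntPtr hdcDest, int xDest, int yDest, int width, int height,
                              IntPtr hdcSrc, int xSrc, int ySrc, uint rop);

    const uint SRCCOPY = 0x00CC0020;

    static void Main()
    {
        // GetDC(NULL) asks for a device context covering the whole virtual screen.
        // With no interactive desktop this can fail outright, or hand back a DC
        // for a desktop that has nothing drawn on it.
        IntPtr screenDc = GetDC(IntPtr.Zero);
        if (screenDc == IntPtr.Zero)
        {
            Console.WriteLine("No screen device context available to capture.");
            return;
        }

        try
        {
            using (var bitmap = new Bitmap(800, 600))   // placeholder size
            using (var g = Graphics.FromImage(bitmap))
            {
                IntPtr destDc = g.GetHdc();
                // Blit the top-left 800x600 pixels of the screen into the bitmap.
                BitBlt(destDc, 0, 0, 800, 600, screenDc, 0, 0, SRCCOPY);
                g.ReleaseHdc(destDc);
                bitmap.Save("blit.png", ImageFormat.Png);
            }
        }
        finally
        {
            ReleaseDC(IntPtr.Zero, screenDc);
        }
    }
}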
This question is actually more similar to those from people who are trying to take screenshots from a service or an ASP.NET application. I have no idea what those people expect to happen. Likewise here. A Google search will reveal a few workarounds for this problem, many of them posted here on Stack Overflow. Generally, they all boil down to impersonating a particular user who has a desktop that can be screenshotted. But I don't see how that will work for you.
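For reference, here is roughly what that family of workarounds tends to look like. This particular sketch switches the capturing thread onto the interactive input desktop rather than doing full user impersonation; the class and method names are mine, and it still assumes that somebody is logged in and has a desktop to attach to, which is exactly the condition you are missing.

using System;
using System.Runtime.InteropServices;

static class DesktopSwitch
{
    [DllImport("user32.dll", SetLastError = true)]
    static extern IntPtr OpenInputDesktop(uint flags, bool inherit, uint desiredAccess);

    [DllImport("user32.dll", SetLastError = true)]
    static extern bool SetThreadDesktop(IntPtr hDesktop);

    [DllImport("user32.dll", SetLastError = true)]
    static extern bool CloseDesktop(IntPtr hDesktop);

    const uint GENERIC_ALL = 0x10000000;

    // Attach the current thread to whatever desktop is currently receiving input,
    // run the supplied capture action on it, then release the desktop handle.
    // Note: SetThreadDesktop only succeeds on a thread that has not yet created
    // any windows or hooks, so call this early.
    public static void RunOnInputDesktop(Action capture)
    {
        IntPtr desktop = OpenInputDesktop(0, false, GENERIC_ALL);
        if (desktop == IntPtr.Zero || !SetThreadDesktop(desktop))
            throw new InvalidOperationException("No interactive input desktop to attach to.");

        try { capture(); }
        finally { CloseDesktop(desktop); }
    }
}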
If I understand the problem correctly, the solution is very simple. You don't need to "force windows into thinking that there is some monitor (virtual/fake) so it can take a screenshot". You just need to log in, start Microsoft Word, and run your code. Try setting your computer to automatically log in at startup. And when you're using Remote Desktop, just disconnect from the virtual session without logging off.
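If you do go the automatic-logon route, the usual mechanism is the Winlogon registry values. The snippet below is only a sketch of setting them from C#; the helper name and the example credentials are placeholders I made up, and DefaultPassword ends up stored in clear text, which is why tools like Sysinternals Autologon are often preferred.

using Microsoft.Win32;

static class AutoLogonSetup
{
    // Writes the well-known Winlogon auto-logon values. Must run elevated.
    public static void Enable(string userName, string domain, string password)
    {
        using (var key = Registry.LocalMachine.OpenSubKey(
            @"SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon", writable: true))
        {
            key.SetValue("AutoAdminLogon", "1", RegistryValueKind.String);
            key.SetValue("DefaultUserName", userName, RegistryValueKind.String);
            key.SetValue("DefaultDomainName", domain, RegistryValueKind.String);
            // Stored in clear text in the registry.
            key.SetValue("DefaultPassword", password, RegistryValueKind.String);
        }
    }
}

// Example call (placeholder credentials):
// AutoLogonSetup.Enable("CaptureUser", ".", "your-password-here");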
Source: https://stackoverflow.com/questions/23250862/windows-how-to-simulate-connected-screen-via-vga