Question
I am trying to build a server on a Raspberry Pi that sends live-stream image data to a browser. The server side is written in Python with Tornado, and the client side in HTML and JavaScript; both use WebSocket. (I am a beginner at JavaScript.)
Here is the code.
Server side:
import cv
import tornado.httpserver
import tornado.web
from tornado.ioloop import IOLoop
from tornado.websocket import WebSocketHandler

class WSHandler(WebSocketHandler):
    def initialize(self, camera):
        self.camera = camera
        cv.SetCaptureProperty(self.camera.capture, cv.CV_CAP_PROP_FRAME_WIDTH, 480)
        cv.SetCaptureProperty(self.camera.capture, cv.CV_CAP_PROP_FRAME_HEIGHT, 360)

    def open(self):
        print("connection opened")
        while True:
            self.loop()

    def loop(self):
        img = self.camera.takeImage()
        self.write_message(img, binary=True)

class Camera():
    def __init__(self):
        self.capture = cv.CaptureFromCAM(0)

    def takeImage(self):
        img = cv.QueryFrame(self.capture)
        img = cv.EncodeImage(".jpg", img).tostring()
        return img

def main():
    camera = Camera()
    app = tornado.web.Application([
        (r"/camera", WSHandler, dict(camera=camera)),
    ])
    http_server = tornado.httpserver.HTTPServer(app)
    http_server.listen(8080)
    IOLoop.instance().start()

if __name__ == "__main__":
    main()
Client side:
JavaScript (client.js):
var canvas = document.getElementById("liveCanvas");
var context = canvas.getContext("2d");
var ws = new WebSocket("ws://localhost:8080/camera");
ws.onopen = function(){
    console.log("connection was established");
};
ws.onmessage = function(evt){
    context.drawImage(evt.data, 0, 0);
};
HTML (index.html):
<html>
<head>
<title>livecamera</title>
<canvas id="liveCanvas" width="480" height="360"></canvas>
<script type="text/javascript" src="./client.js"></script>
</head>
</html>
When I access this 'index.html' while the server is running, the following error appears:
Uncaught TypeError: Failed to execute 'drawImage' on 'CanvasRenderingContext2D': No function was found that matched the signature provided.
I suspect this is caused by the server sending the data in the wrong format.
My questions are: What data format should be used? How should the server send the data? How should the client receive it?
Answer 1:
I found a similar question about C++ and JavaScript: Display image from blob using javascript and websockets
The server side is the same as before.
On the client side, 'ws.binaryType' has to be set to 'arraybuffer' so the binary message arrives as an ArrayBuffer. The payload then has to be base64-encoded with the 'encode' function taken from the link above.
The code:
JavaScript (client.js):
var img = document.getElementById("liveImg");
var arrayBuffer;
var ws = new WebSocket("ws://localhost:8080/camera");
ws.binaryType = 'arraybuffer';
ws.onopen = function(){
    console.log("connection was established");
};
ws.onmessage = function(evt){
    arrayBuffer = evt.data;
    img.src = "data:image/jpeg;base64," + encode(new Uint8Array(arrayBuffer));
};

function encode(input) {
    var keyStr = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/=";
    var output = "";
    var chr1, chr2, chr3, enc1, enc2, enc3, enc4;
    var i = 0;
    while (i < input.length) {
        chr1 = input[i++];
        chr2 = i < input.length ? input[i++] : Number.NaN; // Not sure if the index
        chr3 = i < input.length ? input[i++] : Number.NaN; // checks are needed here
        enc1 = chr1 >> 2;
        enc2 = ((chr1 & 3) << 4) | (chr2 >> 4);
        enc3 = ((chr2 & 15) << 2) | (chr3 >> 6);
        enc4 = chr3 & 63;
        if (isNaN(chr2)) {
            enc3 = enc4 = 64;
        } else if (isNaN(chr3)) {
            enc4 = 64;
        }
        output += keyStr.charAt(enc1) + keyStr.charAt(enc2) +
                  keyStr.charAt(enc3) + keyStr.charAt(enc4);
    }
    return output;
}
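As an aside, the hand-rolled encoder can be replaced by the browser's built-in btoa function, which takes a binary string and returns base64. This is a sketch, not code from the original post; the helper name bytesToBase64 is mine.

```javascript
// Sketch of an alternative to the hand-rolled base64 encoder above.
// bytesToBase64 is an illustrative name, not from the original post.
function bytesToBase64(bytes) {
    var binary = "";
    for (var i = 0; i < bytes.length; i++) {
        // Append one byte at a time; String.fromCharCode(...bytes) in a single
        // call could overflow the call stack for large frames.
        binary += String.fromCharCode(bytes[i]);
    }
    // btoa converts the binary string to its base64 representation.
    return btoa(binary);
}
```

The onmessage handler would then use `"data:image/jpeg;base64," + bytesToBase64(new Uint8Array(arrayBuffer))` exactly as before.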
HTML (index.html):
I replaced the canvas tag with an img tag.
<html>
<head>
<title>livecamera</title>
<img id="liveImg" width="480" height="360">
<script type="text/javascript" src="./client.js"></script>
</head>
</html>
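Another option that skips base64 entirely is to wrap the received ArrayBuffer in a Blob and point the img element at an object URL. This is a sketch under the same assumption (the server sends raw JPEG bytes as a binary message); the helper names frameToBlob and showFrame are mine.

```javascript
// Sketch: display a received JPEG frame via a Blob and an object URL,
// avoiding the base64 step. frameToBlob/showFrame are illustrative names.
function frameToBlob(arrayBuffer) {
    // Tag the raw bytes with a MIME type so the browser decodes them as JPEG.
    return new Blob([arrayBuffer], { type: "image/jpeg" });
}

var prevUrl = null;
function showFrame(arrayBuffer, imgElement) {
    var url = URL.createObjectURL(frameToBlob(arrayBuffer));
    imgElement.src = url;
    // Revoke the previous frame's URL so a continuous stream does not leak memory.
    if (prevUrl !== null) {
        URL.revokeObjectURL(prevUrl);
    }
    prevUrl = url;
}
```

With this, ws.onmessage reduces to `showFrame(evt.data, img);` and no per-frame encoding is needed.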
Source: https://stackoverflow.com/questions/22554313/send-captured-images-from-python-server-to-javascript-client