I am trying to use WebRTC and HTML5 to achieve this. I am very new to WebRTC, so to get started I used the getUserMedia example as a way to show the video my browser is playing.
As an experiment, for now the video player and the live stream of that video are on the same HTML page.
My HTML code is the following:
<div id="video_player_box">
<video id="my_video" width="1100" height="600" >
<source src="path/to/my/video/">
</video>
</div>
<div id="container">
<h1><a href="../index.html" title="simpl.info home page">simpl.info</a> getUserMedia</h1>
<video autoplay></video>
<p>The <code>MediaStream</code> object <code>stream</code> passed to the <code>getUserMedia()</code> callback in this demo is in global scope, so you can inspect it from the console.</p>
</div>
I am sourcing the JavaScript like this:
'use strict';

// Legacy shim for vendor-prefixed getUserMedia implementations.
navigator.getUserMedia = navigator.getUserMedia ||
    navigator.webkitGetUserMedia || navigator.mozGetUserMedia;

var constraints = {
  audio: false,
  video: true
};

// Target the autoplay preview <video> inside #container,
// not the #my_video file player.
var video = document.querySelector('#container video');

function successCallback(stream) {
  window.stream = stream; // stream available to console
  video.srcObject = stream; // attach the camera stream to the video element
}

function errorCallback(error) {
  console.log('navigator.getUserMedia error: ', error);
}

navigator.getUserMedia(constraints, successCallback, errorCallback);
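In browsers that support it, the unprefixed, promise-based `navigator.mediaDevices.getUserMedia` together with the `srcObject` property is the more modern way to wire this up. A minimal sketch; `attachStream` is my own helper name, not a standard API:

```javascript
// Attach a MediaStream to a <video> element, preferring the
// modern srcObject property over the legacy object-URL approach.
function attachStream(videoEl, stream) {
  if ('srcObject' in videoEl) {
    videoEl.srcObject = stream; // modern browsers
  } else {
    videoEl.src = URL.createObjectURL(stream); // legacy fallback
  }
  return videoEl;
}

// Browser usage (assumes the #container preview element from the HTML above):
// navigator.mediaDevices.getUserMedia({ audio: false, video: true })
//   .then(function (stream) {
//     attachStream(document.querySelector('#container video'), stream);
//   })
//   .catch(function (err) {
//     console.log('getUserMedia error: ', err);
//   });
```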
Can anyone help me with this and tell me if I am going in the right direction, or should I use some other method? So far this has not worked, and I am still reading about WebRTC.
Note: I don't want to use the WebRTC screen-sharing method for this functionality.
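Since the actual goal is to stream the video the page itself is playing rather than the camera, `HTMLMediaElement.captureStream()` may be closer to what is needed: it returns a `MediaStream` of the element's playback, whose tracks could then be sent over an `RTCPeerConnection`. A sketch; `getCaptureStream` is a hypothetical helper name, and `mozCaptureStream` is the historical Firefox-prefixed variant:

```javascript
// Capture the playback of a <video> element as a MediaStream,
// falling back to the Firefox-prefixed method when necessary.
function getCaptureStream(videoEl) {
  var fn = videoEl.captureStream || videoEl.mozCaptureStream;
  return fn ? fn.call(videoEl) : null;
}

// Browser usage (assumes the #my_video player from the HTML above and
// an already-created RTCPeerConnection named pc):
// var player = document.getElementById('my_video');
// var stream = getCaptureStream(player);
// if (stream) {
//   stream.getTracks().forEach(function (track) {
//     pc.addTrack(track, stream);
//   });
// }
```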