Call recordings and reports are not being generated in S3 bucket - amazon-connect

Call recordings and reports are not being generated in the S3 bucket.
The S3 bucket is created, but call recordings and reports are not saved to it. I have added a Set call recording behavior block in the contact flow before the customer inputs, but even then nothing is saved in S3. The S3 bucket has all the permissions: read, write, and update.
Can someone help me with this? I have been stuck on this issue for the last two days.
Thanks in advance

For a recording to be generated, you must enable recording in the contact flow used by the call. Once recording is enabled for a given call, the recording file will appear in the S3 bucket shortly after the call ends.
Note that recordings only occur while the caller is connected to an agent; Lex/IVR interactions are never recorded.
For reports, the S3 bucket only serves as a target location for scheduled report exports. If you don't have any report exports scheduled, you will never see anything in the reports bucket.

Based on your screenshot, the call is never connected to an agent. Call recording will not start until the call is connected to an agent.
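If the Set call recording behavior block is in place and calls do reach an agent but recordings still don't appear, it can also be worth confirming that the instance's data storage for call recordings actually points at the bucket you are checking. A minimal sketch using the AWS SDK for JavaScript v3 (the region and instance ID are placeholders, and the credentials need permission to call ListInstanceStorageConfigs):
// List the S3 storage configuration an Amazon Connect instance uses for call recordings.
const { ConnectClient, ListInstanceStorageConfigsCommand } = require("@aws-sdk/client-connect");

const client = new ConnectClient({ region: "us-east-1" }); // placeholder region

async function showCallRecordingStorage() {
  const resp = await client.send(new ListInstanceStorageConfigsCommand({
    InstanceId: "REPLACE_WITH_CONNECT_INSTANCE_ID", // placeholder instance ID
    ResourceType: "CALL_RECORDINGS",
  }));
  // Each storage config should name the S3 bucket and prefix recordings are written to.
  console.log(JSON.stringify(resp.StorageConfigs, null, 2));
}

showCallRecordingStorage().catch(console.error);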

Related

Is it possible to initiate an upload when the app is suspended?

Here is the user experience I am hoping to achieve: in my app the user can take some photos and upload them to Amazon S3. It is likely that the user won't have internet available at that time (nature of the app and target market), so I want to store those photos on the device if internet is not available and automatically start uploading them whenever internet becomes available.
And this is where I need help: how can I achieve this? If not completely, how much closer can I get to this user experience? For example, store the photos and start uploading when the user opens the app next time, or show a notification when internet is available so the user can tap on it to open the app and start uploading, etc.
Additional information: each upload will take ~1 minute to complete. After an upload is complete I also need to make an HTTP request (if that matters).
Yes, this is possible. Look into URLSessionUploadTask:
Upload tasks are similar to data tasks, but they also send data (often in the form of a file), and support background uploads while the app is not running.

Rails streaming and uploading on the same page

I have a page where users see the uploaded videos. Above the videos there is an "Update" form with the ability to upload videos to the collection. The form uses realtime uploading (the upload starts as soon as the user chooses a file), and when the upload progress reaches 100% the user hits the "Update" button to update the video collection.
The problem occurs when the user wants to upload a file while watching a video. The upload does not progress. There are no messages or errors in the server development log or the client JavaScript console. It just hangs (the video continues playing, though). The upload usually hangs near the start (5%, 20%, depending on the video file size).
By the way, I use the refile gem for managing realtime uploads, storage, file serving, etc. It should also be noted that I use the same Rails application server (Thin 1.6.3) to serve files and accept uploads. (Maybe the server gets busy and cannot accept uploads? If I don't start playing the video, the upload progresses smoothly and completes without a problem.)
Could anyone point me in the right direction? Where should I look? Are there any parameters I should set somewhere?

licode - publishing a stream is not working

I am developing a video conference application using licode with multiple users (say 4).
I want every user to be able to view his own webcam video, but he should be able to publish his video to the conference room only when he gets permission.
I get access to the camera using the following:
localStream.init();
localStream.show("myVideo");
This is working fine.
Through a script we decide which user will get permission to publish a stream. In that script I am using the following code to publish the user's stream:
room.publish(localStream);
But through this the user's stream is not being published to the room; please tell me what I am doing wrong.
Also, is there any way to check how many streams are in the room?
Thanks
The localStream is always available and can be published at any time. Just recheck your code. I would suggest using setTimeout and publishing the stream 30 seconds after your localStream is generated. I am sure this will work.
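For reference, the basic example in the licode client documentation publishes from event callbacks rather than after a fixed delay, which avoids calling room.publish before camera access is granted and the room connection is established. A minimal sketch along those lines (the token variable and the "myVideo" element id are placeholders), which also keeps a simple count of the streams in the room:
// Publish only once camera access is granted and the room is connected,
// and keep track of how many streams are currently in the room.
var localStream = Erizo.Stream({ audio: true, video: true, data: true });
var room = Erizo.Room({ token: token }); // token obtained from your server

var streamsInRoom = 0;

localStream.addEventListener('access-accepted', function () {
  localStream.show('myVideo'); // local preview
  room.connect();
});

room.addEventListener('room-connected', function (roomEvent) {
  streamsInRoom = roomEvent.streams.length; // streams already in the room
  room.publish(localStream);               // or publish later, once permission is granted
});

room.addEventListener('stream-added', function (streamEvent) {
  streamsInRoom += 1; // a stream (possibly your own) was added to the room
});

room.addEventListener('stream-removed', function (streamEvent) {
  streamsInRoom -= 1;
});

localStream.init(); // triggers the camera/microphone permission prompt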

YouTube API - Remove the rejected duplicate as well as the original copy

Every now and then, when I push a video through the YouTube API, I do not get the success result back to my server with the appropriate video ID (presumably some kind of random connectivity issue).
This creates an issue where my server assumes the video hasn't uploaded, so it attempts to upload it again, which then creates a duplicate.
How would I find and remove any duplicates, as well as the first uploaded video, so I can start the process again and upload a fresh copy?
Is there a way to tell YouTube to delete the video if it cannot return the ID successfully to my server?
Cheers

How can I get the last transaction on an S3 bucket?

I am a beginner with the AWS S3 SDK and am running into a problem in my project.
I want to get the uploaded or downloaded size of the file that is currently being uploaded. The functionality of my application is that it uploads content directly from the client browser to Amazon S3. But if the data transfer is interrupted and an exception is raised, I can't track how much of the file's data has been transferred.
If the data transfer is interrupted you will have to start all over; there is no way to resume the transfer where you left off. Check out the official Amazon S3 forum for more info.
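If the goal is simply to know how many bytes were sent before the interruption, the AWS SDK for JavaScript (v2) in the browser reports upload progress as it goes. A minimal sketch assuming an s3.upload() call, with placeholder bucket and key names and the file variable coming from a file input; it tracks progress but does not resume a failed transfer:
// Track how many bytes of a browser upload to S3 have been sent so far.
var s3 = new AWS.S3(); // assumes the SDK and credentials (e.g. Cognito) are already set up

var lastLoaded = 0;

var upload = s3.upload({
  Bucket: 'example-bucket',    // placeholder bucket name
  Key: 'uploads/example.jpg',  // placeholder object key
  Body: file                   // File object from an <input type="file">
});

upload.on('httpUploadProgress', function (progress) {
  lastLoaded = progress.loaded; // bytes sent so far
  console.log('Uploaded ' + progress.loaded + ' of ' + progress.total + ' bytes');
});

upload.send(function (err, data) {
  if (err) {
    // lastLoaded says how far the upload got before it failed;
    // the upload itself still has to be retried.
    console.error('Upload failed after ' + lastLoaded + ' bytes:', err);
  } else {
    console.log('Upload finished:', data.Location);
  }
});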
