When a video finishes transcoding, the output property appears in the list asset endpoint response, containing URLs for every asset created. You'll generally want adaptive bitrate (ABR) streaming, so note the HLS URL, as HLS has the widest device compatibility:
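As a minimal sketch, here's how you might pick the HLS URL out of a returned asset. The response shape below (`output.hls`) is an assumption for illustration — check the asset schema in the API reference for the exact field names:

```javascript
// Select the HLS URL from a list asset response item.
// The "output.hls" field name is an assumption, not the documented schema.
function pickHlsUrl(asset) {
  return asset.output ? asset.output.hls : undefined;
}

// Illustrative asset object, shaped like a list asset response entry:
const asset = {
  id: "asset_123",
  status: "ready",
  output: {
    hls: "https://cdn.example.com/asset_123/master.m3u8",
    mp4: "https://cdn.example.com/asset_123/high.mp4",
  },
};

const hlsUrl = pickHlsUrl(asset); // "https://cdn.example.com/asset_123/master.m3u8"
```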
Then, use the Veeplay video player to render the video:
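A sketch of the setup, assuming a constructor-style player API — the constructor name and option keys below are illustrative assumptions, not the real SDK surface; see the player documentation for the actual API:

```javascript
// Hypothetical player configuration — key names are illustrative only.
const playerConfig = {
  container: "#player", // CSS selector of the element to render into
  source: "https://cdn.example.com/asset_123/master.m3u8", // HLS URL from the asset's output
  autoplay: false,
};

// In the browser, this config would be handed to the player constructor, e.g.:
// const player = new VeeplayPlayer(playerConfig); // hypothetical constructor name
```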
Direct upload is a feature we're currently developing, and it will be available soon. Meanwhile, you'll need to provide a publicly accessible URL for the source media file. If the file is on your local filesystem, a simple way to get a temporary URL is to use ngrok to create a local file server - see ngrok's docs on how to achieve this.
No. After the Veeplay API processes the source file, you can safely make it private or remove it from your storage.
No. After the Veeplay API processes the source file, it is removed from our systems permanently.
See the supported inputs section of the API docs for a list of accepted formats.
When generating multiple renditions to support adaptive bitrate streaming with HLS and DASH, Veeplay doesn't use a static bitrate ladder - instead, we use machine learning to infer the optimal ladder per title, based on the properties of each individual video. This results in a set of renditions that maximizes perceived video quality while minimizing bandwidth requirements.
You can set up a webhook URL to receive a notification on every media asset status update during the ingestion workflow. Read more about setting up webhooks in the API documentation.
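As a sketch, a webhook receiver just parses the JSON body of each notification. The field names below (`asset_id`, `status`) are assumptions for illustration — the actual payload schema is defined in the API documentation:

```javascript
// Parse a webhook notification body.
// Field names ("asset_id", "status") are illustrative assumptions.
function parseStatusUpdate(rawBody) {
  const event = JSON.parse(rawBody);
  return { assetId: event.asset_id, status: event.status };
}

const update = parseStatusUpdate('{"asset_id":"asset_123","status":"ready"}');
// update → { assetId: "asset_123", status: "ready" }
```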
Yes, clipping is supported. See the clip parameter of the create asset endpoint. Here's an example input that extracts a 40-second clip starting at 10 seconds:
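A sketch of the request body — the `clip` sub-field names (`start`, `duration`, in seconds) are assumptions for illustration; check the create asset endpoint reference for the exact names and units:

```javascript
// Hypothetical create asset request body; "start"/"duration" field names
// and their units (seconds) are assumptions.
const createAssetBody = {
  source: "https://example.com/source.mp4",
  clip: {
    start: 10,    // begin 10 seconds into the source
    duration: 40, // keep a 40-second clip
  },
};
```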
Yes, overlays are supported. See the overlays parameter of the create asset endpoint. Here's an example that adds a logo to the bottom-left area of the video, with the width equal to 20% of the full video width:
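A sketch of the request body — the overlay field names (`url`, `position`, `width`) are assumptions for illustration; see the create asset endpoint reference for the documented schema:

```javascript
// Hypothetical create asset request body; overlay field names are assumptions.
const createAssetBody = {
  source: "https://example.com/source.mp4",
  overlays: [
    {
      url: "https://example.com/logo.png",
      position: "bottom-left", // anchor the logo to the bottom-left area
      width: "20%",            // 20% of the full video width
    },
  ],
};
```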
Yes, audio normalization is supported. This brings the audio loudness of your input to a standard target level during encoding. The algorithm used for audio normalization is ITU-R BS.1770-1, and the target loudness value is -24 LKFS. See the audio_normalization parameter of the create asset endpoint. Here's an example of applying audio normalization:
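A sketch of the request body — whether `audio_normalization` is a boolean flag (as shown) or an object is an assumption; the algorithm (ITU-R BS.1770-1) and target (-24 LKFS) are fixed by the service, so no tuning fields are shown:

```javascript
// Hypothetical create asset request body; the boolean shape of
// "audio_normalization" is an assumption.
const createAssetBody = {
  source: "https://example.com/source.mp4",
  audio_normalization: true, // normalize loudness to -24 LKFS per ITU-R BS.1770-1
};
```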
To use the Veeplay video players within your mobile or web apps, you need to register them inside your Veeplay account dashboard:
- for iOS, you'll need to specify your app's bundle identifier;
- for Android, you'll need to specify your app's package name;
- for web apps, you can use the player on localhost without registering an app - otherwise, you'll need to specify your app's domain name.
To improve user experience, browsers and OSs enforce strict policies regarding video autoplay. Generally, autoplay is only allowed without sound before the user interacts with your app/domain in a meaningful way.
- For Chrome, video may also autoplay with sound, depending on the user's Media Engagement Index (MEI). This is a measurement of previous user interaction with a specific domain, and it is highest on sites where the user regularly plays media. See your local MEI scores and read more about Chrome's autoplay policies.
- For Safari, the user controls which sites are allowed to autoplay. By default, autoplay executes only if the video doesn't contain an audio track, or if the video is muted. Read more about Safari's autoplay policy.
- For Firefox, all video with sound is prevented from autoplay by default. Read more about Firefox's autoplay policy.
To make sure your video will autoplay, mute the player before starting playback:
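A sketch of the pattern, assuming an HTMLMediaElement-style player object with `muted` and `play()` (the helper name is illustrative):

```javascript
// Mute before calling play(): muted autoplay is broadly allowed by
// browsers, while unmuted autoplay is usually blocked.
function safeAutoplay(player) {
  player.muted = true;
  return player.play().catch(() => {
    // Autoplay was still blocked — fall back to a tap-to-play UI here.
  });
}
```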
Yes, UI customizations are supported. See the list of controls customization options available via JSON configuration. Similar properties are also natively available on each platform.
The Android & iOS players are singletons, so you can only have a single player instance at a time. Multiple players are, however, supported with the JS SDK:
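A sketch of running several players side by side with the JS SDK — the config shape and constructor name below are illustrative assumptions; see the JS SDK docs for the real API:

```javascript
// Hypothetical: each entry describes one independent player instance.
const configs = [
  { container: "#player-one", source: "https://cdn.example.com/a/master.m3u8" },
  { container: "#player-two", source: "https://cdn.example.com/b/master.m3u8" },
];

// In the browser, each config would get its own instance, e.g.:
// const players = configs.map((cfg) => new VeeplayPlayer(cfg)); // hypothetical name
```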