Many webcast jobs may involve simply bringing a video encoder to an existing production, taking an audio/video source from the production team, connecting it to the encoder, connecting the encoder to the Internet, and pressing “start” - followed by coffee and staring at the green light for a few hours, hoping it stays on!
For the majority of smaller webcasts - particularly the “Adds value standard” - cost is a key constraint, yet they will require the webcaster to organize the production too. To deal with this type of client, having a good, solid low-budget setup is key, focused on the “bare minimum” required. Taking this approach is a good idea even when you plan to double-up all your key kit elements: a cheaper setup means a lower cost when building the redundant rig for your spares bag.
There are obviously myriad technology options in the market, and while the final choice will be personal, it is probably useful to outline the “rigs” (my term for my webcast setups) that I use in production.
My own main rig is built around my MacBook Air, a Behringer Xenyx 302 USB micro mixer and my Roland VR-3EX.
I include in Figure 3.1 the wiring schematic for a recent webcast I set up to cover a drone race.
You will notice that I carry several other key items in the rig. Since we had four live drone feeds, I elected to take a second vision mixer to create a sub-mix specifically of the racing feeds. In this case I used a Roland VR3 for the purpose: its Preview output went into channel 3 of the VR-3EX as a single source showing all four cameras, and its main output went into channel 2 of the VR-3EX. This meant that at any time we could cut to a “quad” shot of all four race feeds, or, by selecting a specific channel on the VR3 and then switching the VR-3EX to channel 2, put that selected race feed out on the main VR-3EX output.
The audio feed was being sent to several places, so I used a small Behringer HA400 audio distribution amp to create splits for commentators, for the PA, and for the restricted service license (RSL) FM radio broadcast, which those on site could listen to on small FM radios, etc.
Commentators used radio mics connected directly to the VR-3EX. The VR3 carried no audio in this instance, but its audio feeds could have been connected to the VR-3EX had they been required.
Why the Xenyx 302 USB? This is ultimately a luxury, but it is one I use on every webcast, so it is very familiar to me; when something goes wrong, my muscle memory is instinctive and can “cut across” the rest of the production directly. This setup allows me to create an audio sub-mix combining the laptop's audio out (for playing music or sound from prerecorded interstitial videos), sounds from cartridges (I use an app called BossJock running on my iPad mini), and a separate mic mix for my own booth mic. This means that even if the VR-3EX needed rewiring or adjusting mid-event, I could continue with basic sound and commentary from the 302, and even run video inserts directly from the laptop to cover any production outages. I find that having that extra level of production gives structure to the show, since there are times when I take back control from the VR-3EX producers to run pre-recorded video to air while the producers set up for subsequent shots, and so on. On its own, I also use the iPad and Xenyx to produce simple audio streams mixing music and commentary - I regularly use this setup for TheThursdayNightShow (a hobby Internet radio station I kicked off a few years ago to indulge a personal passion for music!).
Figure 3.1 My webcast rig set up for a drone race webcast.
So this rig represents a relatively sophisticated shoot. While the preceding example is oriented around drone racing, the shoot could equally be a five-camera shoot of a conference, a small sports event, etc. The principles remain the same.
For the drone race, we were producing the output both for a large projection screen and for YouTube Live. The VR-3EX outputs video to the laptop at 640 x 480 over its UVC (USB Video Class) output. While a little limiting in terms of quality, the drones in this instance all produced 640 x 480 video, so we worked to that as the base standard. By today's 4K and 8K standards, 640 x 480 may feel somewhat constrained, but our audience was small and purely online. Keeping the quality low like this means a reasonably good image can be produced with a contribution feed of around 1Mbps - great for streaming over 4G, as we were on the day. It also means viewers get a great-quality image on smartphones, tablets, and other small screens without requiring a high-capacity connection - meaning better worldwide engagement from those in places with limited connectivity. This was perfect for our audience.
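As a rough sanity check on that 1Mbps figure, a bits-per-pixel calculation is a common rule of thumb. The sketch below assumes a 30fps feed and a ~0.1 bits-per-pixel comfort threshold for H.264 - both are my illustrative assumptions, not figures from the rig itself:

```python
def bits_per_pixel(bitrate_bps, width, height, fps):
    """Rule-of-thumb encoder load: bits available per pixel per frame."""
    return bitrate_bps / (width * height * fps)

# The drone-race feed: 640 x 480, assumed 30 fps, ~1 Mbps contribution.
bpp = bits_per_pixel(1_000_000, 640, 480, 30)
print(f"{bpp:.3f} bits/pixel")  # around 0.1 - comfortable territory for H.264
```

Run the same numbers for a 1080p30 feed and you get roughly 0.016 bits/pixel at 1Mbps, which shows why the low base resolution was the right call for a 4G contribution link.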
At the other end of the spectrum, I have used almost the same rig for producing live coverage from Parliament online - the annual meeting in the UK Parliament focusing on Internet governance issues. Again the quality target was “good enough” rather than HD, so this rig again proved its flexibility.
And the best thing about it is I can carry it all in two cases and a backpack - which makes getting onsite easy, and allows me the flexibility to take public transport or taxis without too much strain.
It is also worth noting that the market is ever evolving. While id3as - my company - tailor-makes platforms for specific operations, I always explore general-purpose tools. Traditionally limited technologies such as the Flash Media Live Encoder facilitate only single-channel encoding on a single machine. However, on the laptop I have recently been having a lot of success with some open source software called Open Broadcaster Software. While there are some complexities in ensuring that a laptop can acquire multiple video capture sources, as soon as you move to a small ITX computer you can add any number of different types of capture cards. Essentially this is how we are able to produce live multi-camera mixing with almost all the capabilities seen in Figure 3.1, but using a single computer. This system is under development at the moment, and I will be publishing an article later in the year (which may be available on StreamingMedia's website by the time the book is published) demonstrating this approach. Ultimately that option will be considerably cheaper, although until it is used in production I cannot comment on its suitability for real-world production purposes.
Figure 3.2 Streamstar.com's webcast case - All in One: Great for Sports.
Of course, there are many other tools in the market, ranging from the extensively used software-only Wirecast to the all-in-one portable webcast cases from Streamstar (Figure 3.2) and Livestream, which offer compact and excellent functionality - including features such as slow motion and instant replay - and combine titles and graphics in a single unit.
What suits you will come down to your budget. For high-end production, it is usually best simply to hire kit in. In real terms, buying a specific rig for high-end production calls for considerable capital expenditure, brings high maintenance costs, and will most likely date quickly: those very high-end features will be superseded before you amortize the capital outlay, so you need to make more and more capital outlays to compete for high-end, feature-focused events. By hiring such kit and passing the cost to each customer based on the specifics they require, you will normally see lower operating costs between events. Obviously, if a client provides regular work, this can change dramatically in favor of owning the technology yourself and including the operating overhead of maintenance, etc., in your service fees.
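The hire-versus-buy trade-off above boils down to a simple break-even calculation: owning pays off once the hire fees you avoid outweigh the purchase price plus ongoing upkeep. The sketch below uses entirely hypothetical prices for illustration - they are not real rental or purchase figures:

```python
def break_even_events(purchase_cost, annual_maintenance,
                      events_per_year, hire_cost_per_event):
    """Number of events after which owning a rig beats hiring one in.

    Spreads maintenance across the year's events and compares the
    per-event saving with the purchase price. Returns None if hiring
    is always cheaper (the hire fee never covers ownership overhead).
    """
    maintenance_per_event = annual_maintenance / events_per_year
    saving_per_event = hire_cost_per_event - maintenance_per_event
    if saving_per_event <= 0:
        return None
    return purchase_cost / saving_per_event

# Hypothetical figures: a 20,000 rig, 2,000/year upkeep, 10 gigs a year,
# and 1,500 per gig to hire equivalent kit.
events = break_even_events(20_000, 2_000, 10, 1_500)
print(f"Owning pays off after about {events:.0f} events")
```

With those placeholder numbers, ownership breaks even after roughly a season and a half of regular work - which is exactly why occasional high-end jobs favor hiring, while a client providing steady bookings tips the balance toward buying.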
Every gig, every rig, and every opportunity is different: what suits your situation will be unique.