Friday, 13 March 2015

The future of TV is an app!!

New OTT app-based video services are set to flood the market in 2015, pushing more high-quality video to the mobile masses.

As this happens, viewing share will shift from television to devices like smartphones and tablets, both within and beyond the home. This means not only a turn from legacy to broadband delivery, but also a fundamental transformation of the viewing experience: a shift from social viewing in front of large-screen televisions to solitary viewing in front of smaller screens. The impact on the platform viewing mix will be profound.

Sunday, 2 February 2014

Understand your subnets.....

Hello one and all. 

It has been a long time since my last post, so here is another one to review.

This post is less about broadcast news and views and more of a networking post.

I wonder how many of you actually pay attention to the network infrastructure in your home or workplace. Not many, I bet :) You may not need to care, and are happy just plugging a cable into a socket and having everything work. But what if everything is not working as it should? And you may simply want to understand how your network hangs together.

Today I want to pass along some knowledge and tips for working out your subnetworks and hosts in IPv4. It is fair to say that at home you are almost certainly (99.99% sure) using a Class C private IP address, starting 192.168.x.x, with a CIDR prefix of /24. Am I right? In the home, a Class C address with a /24 is fine, so long as you never need more than 254 devices connected to your network at once. You may think you will never need that many IP addresses at home, so no worries.

Let's count the ones we have now on a typical home network:
Network 192.168.1.0 /24

Well, I count around 20 devices that require an IP address, and that is before friends and family come over with their phones and tablets grabbing your bandwidth...

So on reflection I think I am OK to use a Class C IP address and a 255.255.255.0 subnet mask. Even if the fridge, microwave and vacuum all need a connection in the future, I think we will be OK.

So what are these classes and subnets that I mention?

Well, let's start with the class. An IPv4 address is made up of four octets, x.x.x.x, and each octet is 8 bits, with each bit either a 0 or a 1.


These addresses are split into classes. If we look at the first octet we can tell the class:

1 - 126     Class A
128 - 191   Class B
192 - 223   Class C

(127 is reserved for loopback addresses, which is why it is missing from the list.)

Plus we have some others kicking around at the top end:

224 - 239   Class D, reserved for multicasting
240 - 255   Class E, reserved for future/experimental use

The default subnet masks for these classes are:
Class A - /8   (255.0.0.0)      8 network bits, 24 host bits
Class B - /16  (255.255.0.0)    16 network bits, 16 host bits
Class C - /24  (255.255.255.0)  24 network bits, 8 host bits

Within our home and office environments we use private IP addresses: ranges of addresses that can be used freely inside your own network. These are:

Class A   10.0.0.0     -   10.255.255.255   (10/8 prefix)
Class B   172.16.0.0   -   172.31.255.255   (172.16/12 prefix)
Class C   192.168.0.0  -   192.168.255.255  (192.168/16 prefix)
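
If you want to sanity-check an address yourself, here is a minimal Python sketch (standard library only; the sample addresses are purely illustrative) that works out the classful range from the first octet and asks the ipaddress module whether the address falls in one of the private ranges above:

    import ipaddress

    def address_class(addr: str) -> str:
        """Classify an IPv4 address by its first octet (classful rules)."""
        first = int(addr.split(".")[0])
        if 1 <= first <= 126:
            return "A"
        if first == 127:
            return "loopback"
        if 128 <= first <= 191:
            return "B"
        if 192 <= first <= 223:
            return "C"
        if 224 <= first <= 239:
            return "D (multicast)"
        return "E (experimental)"

    # Sample addresses chosen purely for illustration
    for addr in ["10.1.2.3", "172.20.0.5", "192.168.1.10", "8.8.8.8"]:
        ip = ipaddress.ip_address(addr)
        scope = "private" if ip.is_private else "public"
        print(f"{addr:<15} class {address_class(addr):<16} {scope}")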


Let's look at how we could subnet our home network. It is not something you will need or want to do at home, but you will gain an understanding of what is occurring.

Our current configuration is a CIDR /24 subnet, or 255.255.255.0. Let's work out what that gives us:

Our network ID is 192.168.1.0
The first host IP is 192.168.1.1 (generally your home router)
Our last host IP on this network is 192.168.1.254
And our broadcast address on this network is 192.168.1.255
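
As a quick sanity check, Python's built-in ipaddress module will hand you the same four values for the 192.168.1.0/24 network. A minimal sketch, nothing vendor-specific assumed:

    import ipaddress

    net = ipaddress.ip_network("192.168.1.0/24")
    hosts = list(net.hosts())          # every usable host address in the /24

    print("Network ID:       ", net.network_address)    # 192.168.1.0
    print("First host:       ", hosts[0])               # 192.168.1.1
    print("Last host:        ", hosts[-1])              # 192.168.1.254
    print("Broadcast address:", net.broadcast_address)  # 192.168.1.255
    print("Usable hosts:     ", len(hosts))             # 254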


Right, now I have worked out that I only need 20 to 25 devices connected to my network, so I do not need a /24 subnet.

Let's work out how to fit this number of devices into a subnetwork smaller than 254 hosts.

Note: a 1 in the octet is a network bit and a 0 represents a host bit.

To work this out we review the last octet, which is currently set to:
128 64 32 16 8 4 2 1
0 0 0 0 0 0 0 0

This gives us 254 usable hosts.
But if we change the network and host bits in this octet to this:
128 64 32 16 8 4 2 1
1 1 1 0 0 0 0 0

This gives the octet a value of 224 (128 + 64 + 32), and the full mask 255.255.255.224 can be written as /27.
We can now work out the following:

How many hosts will this give us?
Using the formula 2^h - 2, where h is the number of host bits (the five zeros): 2^5 = 32, minus 2 for the network and broadcast addresses, gives 30 usable hosts.

How many subnetworks will we get?
2 to the power of 3 (the three borrowed network bits) = 8, so we get 8 subnetworks using this mask in this Class C example.

What are the valid subnets?
256 - 224 = 32, so a new subnetwork starts every 32 addresses.

What is the host range?
For the first subnet it is 192.168.1.1 to 192.168.1.30, with a network ID of 192.168.1.0 and a broadcast address of 192.168.1.31.
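
The same arithmetic only takes a few lines of Python if you want to experiment with other prefix lengths. A rough sketch of the formulas above, nothing more:

    prefix = 27
    host_bits = 32 - prefix                # 5 host bits left in the last octet
    borrowed_bits = prefix - 24            # 3 bits borrowed from the host part

    print("Mask last octet:", 256 - 2 ** host_bits)   # 224
    print("Subnets:        ", 2 ** borrowed_bits)     # 8
    print("Block size:     ", 2 ** host_bits)         # a new subnet every 32
    print("Usable hosts:   ", 2 ** host_bits - 2)     # 30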

The subnetworks overall would be:

Subnet             First host      Last host        Broadcast
192.168.1.0/27     192.168.1.1     192.168.1.30     192.168.1.31
192.168.1.32/27    192.168.1.33    192.168.1.62     192.168.1.63
192.168.1.64/27    192.168.1.65    192.168.1.94     192.168.1.95
192.168.1.96/27    192.168.1.97    192.168.1.126    192.168.1.127
192.168.1.128/27   192.168.1.129   192.168.1.158    192.168.1.159
192.168.1.160/27   192.168.1.161   192.168.1.190    192.168.1.191
192.168.1.192/27   192.168.1.193   192.168.1.222    192.168.1.223
192.168.1.224/27   192.168.1.225   192.168.1.254    192.168.1.255
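
And if you would rather not build that table by hand, the ipaddress module can enumerate the eight /27 subnetworks for you. Again just a sketch using the standard library:

    import ipaddress

    parent = ipaddress.ip_network("192.168.1.0/24")

    # Carve the /24 into /27s and print the same table as above
    for subnet in parent.subnets(new_prefix=27):
        hosts = list(subnet.hosts())
        print(f"{subnet}  first {hosts[0]}  last {hosts[-1]}  "
              f"broadcast {subnet.broadcast_address}  usable hosts {len(hosts)}")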


I hope this makes sense. If you have never come across networking before, I hope you learned something; if you are not new to networking you will know this already, though you may work out your subnets in a different way.



Sunday, 8 December 2013

IPv6 Explained.

First of all apologies for the delay in posting another topic on this blog. I have been somewhat engaged on other items.

I will publish another post in the week covering the topic of OTT, or Over The Top, delivery. A very interesting subject indeed, and one that needs to be discussed, as many broadcasters are wrestling with the dilemma of how to handle OTT.



In the meantime, some of you may know that I am a network guy. Recently I was asked to explain IPv6. I have been exposed to IPv6 in small amounts through network labs and trials, but most of us have been shielded from having to configure and use IPv6 in anger; good old IPv4 still works in most on-premises networks.

So here goes :) Before we do, I can recommend downloading the BitCricket IPv4/IPv6 subnet calculator, or another of your choosing. Very useful.

Increasing the IP address pool was one of the major forces behind developing IPv6. It uses a 128-bit address, meaning that we have a maximum of 2¹²⁸ addresses available, or 340,282,366,920,938,463,463,374,607,431,768,211,456, or enough to give multiple IP addresses to every grain of sand on the planet. So our friendly old 32-bit IPv4 dotted-quads don't do the job anymore; these newfangled IPs require eight 16-bit hexadecimal colon-delimited blocks. So not only are they longer, they use numbers and letters. At first glance, those mondo IPv6 addresses look like impenetrable secret code:

2001:0db8:3c4d:0015:0000:0000:abcd:ef12

Under IPv4 we have the old familiar unicast, broadcast and multicast addresses. In IPv6 we have unicast, multicast and anycast. With IPv6 the broadcast addresses are not used anymore, because they are replaced with multicast addressing.

IPv6 Unicast

This is similar to the unicast address in IPv4 – a single address identifying a single interface. There are four types of unicast addresses:

  • Global unicast addresses, which are conventional, publicly routable addresses, just like IPv4 publicly routable addresses.
  • Link-local addresses are akin to the private, non-routable addresses in IPv4 (10.0.0.0/8, 172.16.0.0/12, 192.168.0.0/16). They are not meant to be routed, but confined to a single network segment. Link-local addresses mean you can easily throw together a temporary LAN, such as for conferences or meetings, or set up a permanent small LAN the easy way.
  • Unique local addresses are also meant for private addressing, with the addition of being unique, so that joining two subnets does not cause address collisions.
  • Special addresses are loopback addresses, IPv4-address mapped spaces, and 6-to-4 addresses for crossing from an IPv4 network to an IPv6 network.

Multicast


Multicast in IPv6 is similar to the old IPv4 broadcast address: a packet sent to a multicast address is delivered to every interface in a group. The IPv6 difference is that it's targeted: instead of annoying every single host on the segment with broadcast blather, only hosts that are members of the multicast group receive the multicast packets. IPv6 multicast is routable, and routers will not forward multicast packets unless there are members of the multicast group to forward the packets to. Anyone who has ever suffered a broadcast storm will appreciate this mightily.

Anycast


An anycast address is a single address assigned to multiple nodes. A packet sent to an anycast address is then delivered to the first available node. This is a slick way to provide both load-balancing and automatic failover. The idea of anycast has been around for a long time; it was proposed for inclusion in IPv4 but it never happened.

Several of the DNS root servers use a router-based anycast implementation, which is really a shared unicast addressing scheme. (While there are only thirteen authoritative root server names, the total number of actual servers is considerably larger, and they are spread all over the globe.) The same IP address is assigned to multiple interfaces, and multiple routing table entries are then needed to move everything along.

IPv6 anycast addresses contain fields that identify them as anycast, so all you need to do is configure your network interfaces appropriately. The IPv6 protocol itself takes care of getting the packets to their final destinations. It's a lot simpler to administer than shared unicast addressing.


Address Dissection

2001:0db8:3c4d : 0015 : 0000:0000:abcd:ef12
 global prefix    subnet     interface ID

The prefix identifies it as a global unicast address. It has three parts: the network identifier, the subnet, and the interface identifier.

The global routing prefix comes from a pool assigned to you, either by direct assignment from a Regional Internet Registry like APNIC, ARIN, or RIPE NCC, or more likely from your Internet service provider. The subnet and interface IDs are controlled by you, the hardworking local network administrator. :)
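
As a rough illustration (assuming the common split shown above of a /48 global prefix, a 16-bit subnet ID and a 64-bit interface ID), Python's ipaddress module can pull the three parts apart:

    import ipaddress

    addr = ipaddress.IPv6Address("2001:0db8:3c4d:0015:0000:0000:abcd:ef12")
    blocks = addr.exploded.split(":")      # eight fully written 16-bit blocks

    global_prefix = ":".join(blocks[:3])   # 48 bits, assigned by your RIR or ISP
    subnet_id     = blocks[3]              # 16 bits, carved up by you
    interface_id  = ":".join(blocks[4:])   # 64 bits, identifies the interface

    print("Global prefix:", global_prefix)   # 2001:0db8:3c4d
    print("Subnet ID:    ", subnet_id)       # 0015
    print("Interface ID: ", interface_id)    # 0000:0000:abcd:ef12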

You'll probably be running mixed IPv6/IPv4 networks for some time. IPv6 addresses must total 128 bits. IPv4 addresses are represented like this:
0000:0000:0000:0000:0000:0000:192.168.1.25

Eight blocks of 16 bits each are required in an IPv6 address. The embedded IPv4 address occupies the last 32 bits (the equivalent of two blocks), which is why only seven colon-delimited fields appear: six blocks of zeroes plus the dotted quad.

The localhost address is 0000:0000:0000:0000:0000:0000:0000:0001.

Naturally we want shortcuts, because these are long and all those zeroes are just dumb-looking. Leading zeroes within a block can be omitted, and a single contiguous run of all-zero blocks can be collapsed to a double colon (::), so we end up with these:

2001:0db8:3c4d:0015:0:0:abcd:ef12
2001:0db8:3c4d:0015::abcd:ef12
::192.168.1.25
::1

A tool like ipv6calc is invaluable for checking your work. If you are not sure whether your compressed notation is correct, it will display the uncompressed notation for you.
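
If you do not have ipv6calc to hand, Python's ipaddress module will do the same check. The sketch below prints both the compressed and the fully expanded notation for the examples above:

    import ipaddress

    for text in ["2001:0db8:3c4d:0015::abcd:ef12", "::1", "::192.168.1.25"]:
        addr = ipaddress.IPv6Address(text)
        print("compressed:", addr.compressed)
        print("expanded:  ", addr.exploded)
        print()

    # Note: Python prints the embedded IPv4 part in hex
    # (c0a8:0119 is 192.168.1.25), but the value is the same.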






Wednesday, 13 November 2013

Another method of watching content is soon to arrive... Or is it?

Amazon is taking steps toward releasing a video-streaming device in time for the festive period, according to people briefed on the company's plans.

The set-top box, which would pit the online retailer against a host of established rivals, is a small device that resembles a Roku Inc. player and is similarly styled as a platform to run apps and content from a variety of sources, these people said. It would also serve as a delivery vehicle for Amazon's existing streaming video service—available as part of its Prime membership—which competes with Netflix Inc.






Why would Amazon do this? The Wall Street Journal lays out several arguments:

This provides direct-to-TV hardware to enable Amazon’s OTT TV service to bypass the current reach of secondary platforms (like game consoles) on which Amazon’s apps are preloaded.

I would counter that this statement is rubbish. As a consumer, why should I have to go and buy another piece of hardware to receive content? I am sure Netflix identified this and decided to piggyback on the hardware manufacturers to increase the reach of its product, and the results of that decision speak for themselves. Just ask the incumbent pay-TV players how nice it would be not to have to subsidize their set-top boxes. And that’s what this will end up being, a subsidy, as Amazon sells its hardware at break-even or below so that it can sell its services.

It could provide a physical platform through which Amazon can sell cloud-based gaming.

I would counter this statement and point out that you can do this on other hardware; it does not need to be proprietary.

Amazon could provide “a truly complete package of streaming sources…that would be compelling.”

With the exception of maybe Apple, every hardware and service vendor in the net-to-TV space seeks to offer a “complete package” of streaming sources. That’s not the issue. The issue is creating a meta-service above this diverse set of content to help organize and improve the user experience (I’m thinking here of improvements in search and UIs). Roku has done well executing on this vision, and that’s what it takes: execution. And let’s be honest: Amazon may allow in third-party apps, but it will most certainly give preference to its own content.

What do you think?? 


Me... I think Amazon should toss the hardware idea and vigorously pursue a Netflix-like embedded app strategy. It must focus its efforts on differentiating its service from Netflix, not on wasting millions rolling out and supporting a new set-top box. Keeping the video service tied in some way to the larger mass-merchandising vision is critical, and this was accomplished in part by tying the service to Prime membership (a very smart move).

Tuesday, 22 October 2013

More Good News for NewTek





Source: NewTek

At the end of 2012, the UK National Lottery operator Camelot, in partnership with the BBC, which broadcasts the draw, updated the draw presentation formats. This change triggered a search for new production equipment and techniques.

The Saturday evening show is televised and incorporates live performances, as well as the main draws. The challenge was to move the other draw results shows to an on-line format that provides lots of sizzle, a fast turnaround and plenty of branding, all produced reliably, but with minimum staffing and equipment.

“Most people just want to quickly catch up with their results in the week, so we decided to deliver short, succinct results shows via our YouTube channel, website and social media outlets,” explains Max Tilney, Channel Editor, Digital Content, Camelot UK Lotteries Limited.

The new online show format, shot at Pinewood Studios, is produced using a TriCaster live production system that has introduced a great deal of automation and simplification to the production process, at an affordable cost.


 “We wanted a streamlined, efficient and reliable workflow, as we only have one chance to get each draw right,” Max says. “When it came to choosing a production system, we went through a rigorous review process together with the Pinewood team and looked at many products on the market. TriCaster was the clear choice to give us a complete standalone production operation, with all the features we needed and more. All within our budget.”

Since January 2013, the Initial Endemol production team has used TriCaster to produce the EuroMillions results on Tuesdays, the Lotto and Thunderball results on Wednesdays, and the EuroMillions and Thunderball results on Fridays.

For the studio-based Lotto and Thunderball shows, five Panasonic HE60 remote head cameras are pre-set to capture wide shots and close-ups of the lottery machines, and the numbered balls as they are drawn. These locked-off feeds are then sent to the TriCaster. A PC-based graphics system linked to the ball machines and the TriCaster automatically produces a graphic for each ball, and triggers the correct voice-over to announce which ball has been drawn. An operator vision-mixes the camera feeds and graphics, and then creates a second pass to add additional sound bites before preparing the file for export and upload to YouTube.

The results show is uploaded approximately an hour after the draw, and is subsequently embedded into the National Lottery website and social media sites.
Chris Jones, Senior Producer, Initial Endemol, explains, “The YouTube feed is a one-off continuous as-live presentation, and we wanted to make it as quick as possible, rather than building a slow process with an edit. Having the TriCaster has enabled us to create as near to a finished product as possible in one pass.”

The EuroMillions draws are produced in a rather different way, as the satellite feed is sent from Paris, to all countries that participate in the draw. The clean feed goes directly into Pinewood’s TriCaster and the system concurrently applies a crop to remove the Paris show’s graphics. As Chris says, “This is a great feature of the TriCaster, as it shaves considerable time off our turnaround, and the editor can then easily apply our local branding, graphics and voice-overs before exporting. The TriCaster has more than proved its worth for this project, and it has shown us that there are lots more things we could do with it down the line. It is fantastically versatile and has some brilliant features, like being able to take in and output lots of different video formats, simultaneously.”

Paul Darbyshire, Broadcast Director, Pinewood Studios, says, “The TriCaster does exactly what we employed it to do, quickly and simply. The production team is very happy with it, as it has enabled them to automate a large part of the process and minimise turnaround times, all within their set budget. Its flexibility and portability offer the potential to use it for future programming outside the studio environment for multi-camera events.”





Thursday, 17 October 2013

Big Data for Broadcasters




Three questions that I want to answer here:


  1. What is 'Big Data' ?
  2. How big is big to justify it being labelled 'Big Data'?
  3. What does Big Data mean for Broadcasters?

So what is Big Data? 

Well, let's start by looking at the book of knowledge to enlighten us.

Big Data is the term for a collection of data sets so large and complex that it becomes difficult to process using on-hand database management tools or traditional data processing applications. 

From this definition, we are saying that we have so much data that conventional technologies are of no use when it comes to cataloging, indexing and reviewing it.

The book of knowledge continues... The challenges include capture, curation, storage, search, sharing, transfer, analysis and visualization. The trend to larger data sets is due to the additional information derivable from analysis of a single large set of related data, as compared to separate smaller sets with the same total amount of data, allowing correlations to be found to "spot business trends, determine quality of research, prevent diseases, combat crime, and determine real-time roadway traffic conditions."

As of 2012, limits on the size of data sets that were feasible to process in a reasonable amount of time were on the order of exabytes of data. Scientists regularly encounter limitations due to large data sets in many areas, including meteorology, genomics, connectomics and complex physics simulations.

I take the wiki extract as a summary of how Big Data came about and the challenges that certain industries and arenas face when trying to housekeep this data and produce tangible trends and analysis.

But are we saying that broadcasters are being challenged with Big Data issues? Do they really have so much data that it should be classed as Big Data? I am not an expert, just an observer, but I highly doubt that current broadcasters face this issue. Or am I being incredibly short-sighted? Probably; you tell me. One thing is true, however: broadcasters do need to use the data they collect in a more efficient and intelligent manner.

I found this posting incredibly useful when it comes to further describing Big Data. 


The Original Big Data
Big Data as the three Vs: Volume, Velocity, and Variety. This is the most venerable and well-known definition, first coined by Doug Laney of Gartner over twelve years ago. Since then, many others have tried to take it to 11 with additional Vs including Validity, Veracity, Value, and Visibility.

Big Data as Technology
Why did a 12-year old term suddenly zoom into the spotlight? It wasn’t simply because we do indeed now have a lot more volume, velocity, and variety than a decade ago. Instead, it was fueled by new technology, and in particular the fast rise of open source technologies such as Hadoop and other NoSQL ways of storing and manipulating data.

The users of these new tools needed a term that differentiated them from previous technologies, and–somehow–ended up settling on the woefully inadequate term Big Data. If you go to a big data conference, you can be assured that sessions featuring relational databases–no matter how many Vs they boast–will be in the minority.

Big Data as Data Distinctions
The problem with big-data-as-technology is that (a) it’s vague enough that every vendor in the industry jumped in to claim it for themselves and (b) everybody ‘knew’ that they were supposed to elevate the debate and talk about something more business-y and useful.
Here are two good attempts to help organizations understand why Big Data now is different from mere big data in the past:

Transactions, Interactions, and Observations. 
This one is from Shaun Connolly of Hortonworks.  Transactions make up the majority of what we have collected, stored and analyzed in the past. Interactions are data that comes from things like people clicking on web pages. Observations are data collected automatically.

Process-Mediated Data, Human-Sourced Information, and Machine-Generated Data. 
This is brought to us by Barry Devlin, who co-wrote the first paper on data warehousing. It is basically the same as the above, but with clearer names.

Big Data as Signals
This is another business-y approach that divides the world by intent and timing rather than the type of data, courtesy of SAP’s Steve Lucas. The ‘old world’ is about transactions, and by the time these transactions are recorded, it’s too late to do anything about them: companies are constantly ‘managing out of the rear-view mirror’. In the ‘new world,’ companies can instead use new ‘signal’ data to anticipate what’s going to happen, and intervene to improve the situation.

Examples include tracking brand sentiment on social media (if your ‘likes’ fall off a cliff, your sales will surely follow) and predictive maintenance (complex algorithms determine when you need to replace an aircraft part, before the plane gets expensively stuck on the runway).

Big Data as Opportunity
This one is from 451 Research’s Matt Aslett and broadly defines big data as ‘analyzing data that was previously ignored because of technology limitations.’ (OK, so technically, Matt used the term ‘Dark Data’ rather than Big Data, but it’s close enough). This is my personal favorite, since I believe it lines up best with how the term is actually used in most articles and discussions.

Big Data as Metaphor
In his wonderful book The Human Face of Big Data, journalist Rick Smolan says big data is “the process of helping the planet grow a nervous system, one in which we are just another, human, type of sensor.” Deep, huh? But by the time you’ve read some of the stories in the book or the mobile app, you’ll be nodding your head in agreement.


Big Data as New Term for Old Stuff
This is the laziest and most cynical use of the term, where projects that were possible using previous technology, and would have been called BI or analytics in the past have suddenly been rebaptized in a fairly blatant attempt to jump on the big data bandwagon.


How big is big then?

I think the answer here is: it depends. Nobody actually knows how much data is required to categorize it as Big Data; it is a rather loose and fuzzy term. Secondly, how is Big Data handled and managed, and who is actually doing Big Data today?

 

What does Big Data mean for Broadcasters?


With new and current broadcast delivery methods, broadcasters actually do have a great deal of metric and analytic data, combined with global and regional torrent activity (see how Netflix judges what content to publish). Linked with social media activity, we end up with a number of data sets that need to be combined, indexed, reviewed and cultivated to provide signals and opportunities for broadcasters.

In summary, Big Data is the new buzzword and, in my humble opinion, a very broad brush.

James

Thursday, 10 October 2013

Future of Broadcast TV

The words of Joel Espelien from the TDG research group:


One nice perk about being an analyst covering the future of TV is that I get to talk to lots of people across the TV ecosystem about their challenges and concerns. Put simply, what keeps TV guys up at night?

In recent weeks, many of my client conversations have coalesced around a single theme: the future of broadcast TV.

  • What do we do with broadcast?
  • How do we retain the existing broadcast audience?
  • How do we win younger viewers back?
  • How do we make broadcast relevant again?

So what is the future of broadcast? Two main points:

  1. Broadcast is evolving from a technology into a content marketing concept.
  2. The technology industry’s concept of the “product launch” provides the template for what the future of broadcast will look like.

Broadcast is about Marketing, not Technology.

If you’ve read this far into one of my opinions, you are aware that I believe the future of TV is an app. Put another way, the linear broadcasting (digital, analog, or otherwise) model of TV is being supplemented (in the future, perhaps even overtaken) by an app model based upon on-demand streaming to downloadable interactive software clients (apps) on every device with a screen. For the full version of this argument, please check out my report The Future of TV: A View from 2013.

Today the clearest example of this phenomenon is Netflix, which doesn’t broadcast anything. Nevertheless, the marketing function of broadcasting (i.e., getting new content in front of viewers at a single point in time) is highly relevant to Netflix. The Associated Press posted a great article last summer upon the release of Netflix’s new original series, Orange is the New Black. In a nutshell, after weeks of heavy promotion to its users, Netflix posted the first episode of the new series at midnight on July 11, 2013. Note that this was not a live stream. Netflix simply published the episode for viewing anytime on-demand. Even so, and despite the late hour, people watched in droves. Within thirty minutes after its release, Orange is the New Black Episode 1 was already the third most popular show on all of Netflix! Chris Jaffe, Netflix VP of Product Innovation, summed up the phenomenon perfectly, “This is Silicon Valley’s equivalent of a midnight movie premiere in Hollywood.”

Broadcast = Launch.

In the old world, broadcast meant live programs, simultaneously both the beginning and the end of a viewing opportunity. Now you see it, now you don’t. That’s why they invented reruns and DVRs.

Product launches are a very distinct concept. There is a beginning, which may be (but doesn’t have to be) “live.”  More importantly, there is no end. People who miss the live broadcast are not excluded from enjoying the experience at a later time.

The purpose of a launch is to create excitement and community, as the most passionate fans have the same experience at more or less the same time. The new Grand Theft Auto 5 video game was launched on September 17, 2013. It did $800 million in revenue on its first day. It’s been on sale ever since – just saw a stack of them at my neighborhood Walmart on Saturday. Now that’s a launch.

Of course, unlike movies or video games, TV shows consist of multiple episodes. In the new world, companies like Netflix can release all the episodes at once (as it did with Arrested Development), or one at a time (as it did with Orange is the New Black). A variety of combinations are possible.

  • Releasing one new episode per week feels like traditional TV.
  • A new episode every night for two weeks feels like an old-school miniseries (remember Roots?).
  • Posting a whole season at once is reminiscent of DVDs.

In any case, if distributors are serious about embracing the launch model, the episodes need to go up and stay up rather than disappearing after 24, 48, or 72 hours.

Conclusion

As an industry concept, “broadcast” is becoming less and less useful. As Netflix has shown, it is possible to have compelling launch events around new TV shows without broadcasting (and even without live streaming). If the future of TV is indeed an app, the future of broadcast TV is to serve as a launch pad for new TV content.

The Future of Live and On Demand TV

The global TV industry has reached a tipping point in its evolution that will witness changes in viewer behaviors and the appearance of new business models as the distribution of TV content continues to change, according to a new report from IDATE.

The chart depicts the growth of video service revenue on fixed and mobile networks among so-called EU-5 nations, including Germany, France, the U.K., Italy and Spain (million EUR). Source: IDATE.

The report, “World TV & New Video Services Markets,” lays out different scenarios for the future of the industry. Its business-as-usual scenario predicts live viewing will be partially overtaken by on-demand viewing. Additionally, piracy will create obstacles for the transition from physical-media playback to online streaming. The report also forecasts under this scenario that increased competition in the pay TV market will impact pricing, and that ad rates for live TV will fall while growing for video on demand.

The report says that if the future follows the business-as-usual path, the video market on fixed and mobile networks worldwide will grow by an average of 3.2 percent per year from 2013 to 2025. That figure includes average growth of 2 percent for live TV and 14 percent annual growth for on-demand service.

Growth in developed markets will be much lower, the report says. For example, Europe’s top five markets (EU-5) — Germany, France, the U.K., Italy and Spain — will see annual growth at 1.6 percent from 2013 to 2025, with the live TV market — including broadcasters’ catch-up TV services — declining 0.7 percent. On-demand services, however, will experience an average annual increase of 18.5 percent.

The other scenarios portrayed in the report include “the music industry syndrome” and “the best of both worlds.” Under the former scenario, video services exist in a disruptive environment that will see an annual shrinking of 0.8 percent in the EU-5 nations. Further, the decline will not be offset by on-demand services, which will fall by 4 percent annually.

The latter scenario predicts an average annual growth of 3.9 percent, including 2 percent growth for live broadcasting services.

Source : Broadcast Engineering

Tuesday, 8 October 2013

UHD Logo Programme

In June of this year, DIGITALEUROPE announced the start of its work to develop a UHD logo programme for Ultra High Definition consumer equipment. 

DIGITALEUROPE announces its initial findings on the baseline capabilities of UHD consumer displays:

  • Native resolution: 3840 x 2160 pixels
  • Aspect ratio: 16:9
  • Colorimetry: ITU-R BT.709
  • Colour bit depth: 8 bit
  • Frame rate: 24p/25p/30p/50p/60p
  • Audio: PCM 2.0 stereo

DIGITALEUROPE’s ‘Beyond HD’ Group has concluded that these parameters will form a baseline for consumer UHD displays, from their first market launches, for the short to medium term.
As display technology will continue to evolve, DIGITALEUROPE does not want to speculate beyond these baseline UHD characteristics until the consumer uptake of the new UHD products and services is understood.
John Higgins, Director-General of DIGITALEUROPE, notes: “DIGITALEUROPE’s membership contains all of the major TV manufacturers and, as such, it is in a unique position to comment on how the nascent UHD market will develop. While many industry stakeholders speculate on the future UHD market, DIGITALEUROPE feels that the time is right to announce these initial findings to give some guidance to the market on short to medium term UHD consumer display capabilities.”

4K about to jump the chasm?


4K is here to stay and is the new kid on the block.


4K is the next big thing: consumer units are now coming down in price, and early adopters look certain to be joined by the early majority in the next 6 to 18 months, a marketplace edging towards the 4K UHD trend.

Really? Do we really believe that within the year we will all know someone with a 4K TV? It is an extremely fast-paced industry, and with the convergence of the broadcast and technology arenas, growth and evolution happen at warp speed.



So, with consumer units slowly but surely coming down in price (still too expensive for my liking), we need to start seeing decent 4K content, from live events such as sports and concerts to high-profile TV serials.

But to cover the gap until true, pure 4K content is produced and delivered, we need a filler to get us over the hump.

We have seen this before, when we moved from SD to HD. What happened to the existing SD content?




That's right, it was up-converted to high definition. So I can see the same happening here, with the trillions of minutes of 2K HD content being up-converted to 4K.

The good bet is that we will see a number of 2K-to-4K up-conversions taking place, simply taking each 2K pixel and quadrupling it in size (a toy sketch of this naive approach follows below). But this adds no detail to the newly generated 4K image and, quite frankly, may even make it look worse. So a significant degree of enhancement or digital manipulation will need to take place during the conversion process.
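
To make the 'quadrupling' point concrete, here is a toy Python sketch of naive nearest-neighbour upscaling on a tiny made-up frame. Every source pixel simply becomes a 2x2 block, so the image gets bigger but gains no detail; real up-converters layer filtering and enhancement on top of this:

    def nearest_neighbour_2x(frame):
        """Double a 2D grid of pixel values in both dimensions.

        Each source pixel is copied into a 2x2 block, so no new
        detail is created; the picture is simply larger."""
        upscaled = []
        for row in frame:
            doubled_row = [pixel for pixel in row for _ in range(2)]
            upscaled.append(doubled_row)
            upscaled.append(list(doubled_row))   # repeat the row to double the height
        return upscaled

    # A made-up 2x2 "frame" of luma values, purely for illustration
    frame_2k = [[10, 20],
                [30, 40]]

    for row in nearest_neighbour_2x(frame_2k):
        print(row)
    # [10, 10, 20, 20]
    # [10, 10, 20, 20]
    # [30, 30, 40, 40]
    # [30, 30, 40, 40]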



On the flip side manufacturers have begun to equip standard 1080P players with 4K upscaling capabilities. While this is definitely an improvement over imagery that has not been upscaled, it still does not compare to true 4K resolution. Just as with DVD players that upscaled their content to 1080P, upscaling noticeably reduces the appearance of blockiness and jagged edges, but falls short when it comes to depicting more detail.

You have to read Scott Wilkinson's piece at AVS Forum, which discusses and reviews the Technicolor and Marseilles demo of "4K Image Certified" upscaling.

According to Technicolor:

Technicolor Image Certification delivers high-quality 4K content.

4K TVs can replicate the Hollywood movie experience with a stunning image at home. 

The problem is the lack of 4K content.

Now, imagine seeing a standard 1080 Blu‐Ray disc movie upgraded to a stunning 4K image.
Technicolor is now making all of this a reality.

Up-convert processing enables makers of Blu-ray players and set-top boxes to solve the 4K content dilemma.












