Hands-on review: Cambridge Audio Melomania 1 wireless headphones

August 21st, 2019

These little marvels are by no means the first Bluetooth in-ear wireless headphones on the market, but they’re the first such earbuds from Cambridge Audio. Once again, the company has come up trumps with a good hi-fi product that’s great value for money.

The Melomania’s driver diaphragms use graphene for strength and flexibility. The result is a clean sound from ridiculously small bullet-style earbuds. Bluetooth 5.0 gives a reliable connection, while support for the AAC and aptX codecs delivers the best possible wireless audio quality.

What’s equally impressive is that the small buds cram in a nine-hour battery life. They can be stowed in the charging case, which is barely bigger than a matchbox, for a total battery life of 45 hours – the case holds four full recharges. You’ll always want to use the case because the earbuds are so tiny you’d lose them without it.


Hand holding Cambridge Audio Melomania wireless headphones

Image credit: Cambridge Audio

Setup was simple: the earbuds each sync to your phone via Bluetooth and find each other with ease, to work in tandem. From then on, they automatically power up and connect when you take them out of the charging case. There’s no special app, which means one less thing to do, but it also means you can’t fine-tune the headphones’ EQ to tailor the sound.

Fit is fairly good. They come with four sets of tips: three sizes of silicone tips plus one pair of memory foam tips. Between these it’s possible to get a good fit, although the memory foam pair proved fragile and tore slightly when we changed them over. There’s nothing to secure them in place for sports, but if you’ve found the right fit you can at least run for the bus without fear.

Sound quality is impressively open and detailed, but the design offers little sonic isolation from the outside world. This makes the earbuds great for adding a soundtrack to daily life without blocking it out, but they can’t stand up to loud environments. They feature a noise-cancelling mic for phone calls and voice-assistant use, but there is no active noise cancellation to stop background noise disturbing your listening.

Finally, the controls are very impressive: instead of multiple tiny buttons that are hard to tap, the end of each earbud acts as a single big button. As mentioned, the headphones pair with your phone as soon as you take them out of the charging case. Then a single tap on either earbud plays or pauses the music. A double-tap on the left or right earbud skips backwards or forwards a track. A long press turns the volume down or up. And a slow double-press summons the voice assistant. It’s not uncommon to hit pause by mistake as you put them in your ears, but aside from that the controls work very well.


Cambridge Audio Melomania wireless headphones product image

Image credit: Cambridge Audio

In all, it’s an impressive package at a keen price, and it makes a strong case for wireless listening if you’re not already a convert. The Melomania 1s are so tiny that you can pocket the charging case and carry it everywhere, giving every day a soundtrack.

£119.95 cambridgeaudio.com

Alternatives

Apple AirPods

Designed to work with iPhone, Apple Watch, Siri and more. Battery life is 5 hours (more than 24 hours including the juice in the charging case). The £199 version has a wireless charging case so you can simply lay it on a Qi charging mat.

From £159 apple.com

Sony WF-1000XM3

Impressive musicality and, unusually, noise cancelling make these the tiny headphones to beat in terms of pure sound quality but they’re pricey. Battery life is 6 hours (24 hours including charging case).

£220 sony.co.uk

Bose SoundSport Free

A comfortable design and water-repellent mesh over the open ports make these a good choice for sport. Battery life is 5 hours (15 hours including charging case).

£179.95 bose.co.uk

National Grid blackout report ‘fails to get to heart of problems’

August 21st, 2019

This week, National Grid attributed the events of 9 August to a lightning strike on the electricity network north of London that was followed seconds later by outages at Hornsea offshore windfarm and Little Barford gas power station. This created what it describes as “an extremely rare and unexpected event” in which the level of backup power available was insufficient and around five per cent of demand was disconnected.
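To see why losing two generators translates into disconnected customers: when generation suddenly falls short of demand, the shortfall is drawn from the rotating inertia of the remaining machines and system frequency falls; if it falls far enough, automatic Low Frequency Demand Disconnection (LFDD) relays shed load. The sketch below is a deliberately simplified model of that mechanism – the 48.8Hz trip level is GB’s first LFDD stage, but the inertia and system-size figures are our assumptions, not National Grid’s parameters.

```python
# Deliberately simplified sketch: after a sudden generation loss, frequency
# falls at a rate set by the power imbalance and system inertia until
# Low Frequency Demand Disconnection (LFDD) sheds load. Inertia and system
# size are assumed figures, not National Grid parameters.
F_NOMINAL = 50.0   # Hz, GB nominal frequency
LFDD_TRIP = 48.8   # Hz, first LFDD stage in GB
INERTIA_H = 4.0    # s, aggregate inertia constant (assumed)
SYSTEM_GW = 30.0   # GW, generation online at the time (assumed)

def seconds_until_lfdd(lost_gw, dt=0.01):
    """Integrate df/dt = -dP/(2*H*S) * f0 until the LFDD threshold."""
    f, t = F_NOMINAL, 0.0
    while f > LFDD_TRIP:
        f -= (lost_gw / (2 * INERTIA_H * SYSTEM_GW)) * F_NOMINAL * dt
        t += dt
    return t

# With roughly the 9 August shortfall (~1.9GW), these assumptions put the
# grid at the shedding threshold within seconds, before reserves respond.
print(f"LFDD threshold reached after {seconds_until_lfdd(1.9):.1f} s")
```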

The IET welcomed the report, but warned that it fails to get to the heart of problems that run much wider than the electricity grid alone. “This incident raises serious questions about the resilience of the UK’s energy and related infrastructure,” it said in a statement. “[It] highlights the desperate need for greater coordination across what are increasingly complex and interrelated energy, transport and communications systems.”

The IET and independent research group the Energy Systems Catapult have been highlighting what they see as a lack of ‘whole-system’ thinking within the power network through their Future Power Systems Architecture collaboration. The FPSA initiative, which is identifying new capabilities that will be needed in 2030, considers the traditional power system together with installations, appliances and devices on the customer side of the meter. It is also looking at how it interacts with other energy vectors such as transport and heat.

Commenting on the National Grid findings, the IET claimed that the electricity system has changed so much since existing governance procedures were established in the 1980s that they are no longer fit for purpose.

“We now have a highly complex system of which the National Grid is only one part, interconnected physically and through data and information flows to many other systems (such as the Network Rail system and subsystems within it),” it said. “The impact of this lack of whole systems thinking was highlighted by the large degree of disruption faced by rail passengers from what was a comparatively minor electrical incident, stranded because signalling systems and trains were not able to restart once power supplies had been returned to normal. This lack of understanding about impacts across other sectors and on individuals also meant hospitals, residents and business were all impacted.

“The technical governance of these complex systems requires a step change from today, it needs to be holistic, agile, flexible and embracing of the full range of system participants. Without that, we can expect to see more unexpected consequences across the whole system, as well as a failure to seize the benefits whole systems cooperation can bring, not just for major events on the National Grid, but also for the much more numerous power cuts experienced locally every day.”

Energy market regulator Ofgem has launched its own investigation, aimed at identifying what lessons can be learned and what steps can be taken to further improve the resilience of Britain’s energy network.

As well as establishing whether any of the parties involved breached their licence conditions – focusing initially on National Grid’s requirements to hold sufficient back-up power to manage the loss of generation supplies and whether distribution network operators complied with their Low Frequency Demand Disconnection obligations – it will look at whether companies made the right decisions regarding the numbers and types of customers who were disconnected.

There will be discussions with the rail industry to understand why the drop in frequency on the energy network led to such significant disruption for passengers.

Jonathan Brearley, Ofgem’s executive director of systems and networks, said it is important that the industry takes all possible steps to prevent a repeat of the events of 9 August: “Having now received National Grid ESO’s interim report, we believe there are still areas where we need to use our statutory powers to investigate these outages,” he said. “This will ensure the industry learns the relevant lessons and to clearly establish whether any firm breached their obligations to deliver secure power supplies to consumers.”

‘Making guitar pedals?! Freaking mental’ – Steve Bragg, Empress Effects, and the ZOIA

August 20th, 2019

Much like the twisting, surprising, occasionally frustrating evolutionary path of Empress Effects’ ZOIA multi-effects pedal itself, the genesis of this interview actually began nearly three years ago. Empress was releasing its Reverb pedal (it’s… a reverb pedal), so we began talking to Steve Bragg, Empress’ founder, the Emperor of Empress, if you will.

Various work delays, seismic life events (Bragg’s family headcount expanded by one), the inevitable forgetting, followed by the eventual remembering, saw the Reverb launch come and go, to be succeeded in the Empress line-up first by the Echosystem delay pedal and now by the all-encompassing, all-conquering ZOIA (pronounced ‘zoy-ya’).

And yes, while it runs contrary to E&T’s house style, in this instance we are respecting Empress’ preferred all-caps approach for ZOIA, as Bragg asked us to do, “pretty Canadian please”. It’s hard to argue with such a polite request. “And if you want to refer to it in all caps vocally, all you have to do is yell it at the top of your lungs!”

Empress began in 2007 as a one-man operation working out of Ottawa, Canada, with its famously ‘extremely orange’ and full-featured Tremolo pedal (which recently celebrated its 10th anniversary with a special limited-edition, sold-out reissue). Since then, Bragg and Empress Effects have carefully grown both the product line-up and the staff roster.

The company currently offers around 20 different effects and utility pedals, but it was the public release of the ZOIA earlier this year that really blew the lid off Empress’ – and Bragg’s – hitherto untapped ‘mad scientist’ leanings.

Launched with a teaser video depicting a mysterious entity contained within a laboratory, ZOIA eventually escaped into the wild in mid-April 2019. Since then, the audio influencers of social media have been buzzing with new videos and demos.

Already well-known (and highly regarded) for its traditional guitar effects pedals – where one box essentially does one thing, e.g. tremolo, reverb, delay – Empress’ ZOIA is basically a modular synthesizer in pedal form. It takes some of the best Empress algorithms from existing pedals, such as the Reverb, and makes them available for use in any context that the user desires.

In this way, ZOIA can be used to create bespoke effects, synthesizers, MIDI instruments or entire virtual pedal boards, chaining the necessary ‘building blocks’ of sound together directly on the ZOIA. All programming is done on the pedal, via a grid of multicoloured buttons.

ZOIA also ships with many of the standard (guitar) effects preinstalled as existing ‘modules’ ready for use, such as a phaser. As Empress says, you don’t have to know that phasers are created with a series of all-pass filters. What ZOIA enables is the choice to simply play a pre-installed phaser; to build your own dream phaser design, taking that basic phaser and pimping it out to be exactly how you want it, and then take that phaser and layer an ambient reverb on top of it, add a stutter effect, maybe a resampler, dual loopers, one of which has a lo-fi delay and distortion on it, the ability to control the whole thing via MIDI or CV, then save the result as a single module, and then connect that module to another insane creation (or two, or three, or four) fresh from your brain, until a single guitar note sounds like the gargantuan distant reverberation of an alien electric orchestra tuning up somewhere beyond the Milky Way, a heavenly mesmeric signal beamed back to Earth via a broken satellite connection. Woah.
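To make the ‘building blocks’ idea concrete, here is a minimal sketch of module chaining in Python. The module names and behaviours are crude stand-ins invented for illustration; they bear no relation to ZOIA’s actual DSP, but the principle – that a chain of modules can be saved and reused as a single module – is the same.

```python
# Conceptual sketch of ZOIA-style module chaining: each 'module' is a
# function from an audio buffer to an audio buffer, and a patch is a
# composition of modules that can itself be reused as a module.

def phaser(buf):
    return [0.7 * s for s in buf]            # stand-in for an all-pass chain

def ambient_reverb(buf):
    return [1.3 * s for s in buf]            # stand-in for a reverb tail

def stutter(buf):
    return buf[: len(buf) // 2] * 2          # repeat the first half

def chain(*modules):
    """Connect modules in series and return the result as one new module."""
    def patch(buf):
        for module in modules:
            buf = module(buf)
        return buf
    return patch

dream_phaser = chain(phaser, ambient_reverb, stutter)   # save as one module
bigger_patch = chain(dream_phaser, ambient_reverb)      # ...then chain that
print(bigger_patch([1.0, 0.5, -0.5, -1.0]))
```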

Plus, if things get too crazy, you can easily peel back all those layers and get back to the warmth and comfort of a good old-fashioned, straight-up, great-sounding phaser effect, no harm done. The only limit really is your imagination. As Empress says, ZOIA is an infinite trick pony.

How did Empress – or more to the point, Bragg – get to this point of easy-access aural insanity?

“The ZOIA is like a Pure Data or Max/MSP system in a pedal. It’s receiving a lot more attention than any of our other pedals ever did,” Bragg says. “Basically like Pure Data in a guitar pedal. I guess you could replace stuff [other effects pedals] with it, but I think you’d be disappointed to some extent. It shines a lot more as a tool for coming up with new sounds/processors, than as a virtual pedalboard.

“With only one knob, you don’t get the sweet immediate tweakability of a pedal board, but then I guess you could (as I have done) send a MIDI controller into it with tons of knobs, which would give you back some of that tweakability. Anyways, it’s fun!”

Fun, ZOIA most certainly is. Some people are even calling it “the world’s most powerful pedal”, given the infinite possibilities contained therein.

Unsurprisingly, a pedal this complex has taken a lot of development to get anywhere near to what Bragg considers right. ZOIA was quietly unveiled at the music trade show NAMM in January 2018 – instantly setting the hearts of the internet’s audioheadz and gearslutz aflutter – but various mooted shipping dates came and went: “My predictions on when things will be ready are horrible”, acknowledges Bragg.

Despite now being on public release, the development of ZOIA continues. There’s no sense of ‘out the door and we’re done’ at Empress. A number of its pedals are regularly updated with new firmware, easily installed via the SD card slot round the back of the pedal, adding cool new features and quashing bugs based on information garnered from user feedback. Even to reach the point of public release, ZOIA went through many months of beta testing, after many months of in-house refinement.

“We thought we had our hands full with our Reverb and Echosystem products. The ZOIA puts those to shame in terms of complexity,” Bragg recalls. “With the Reverb and Echosystem, there was hardly any dynamic memory allocation. With the ZOIA, we’re allocating and freeing memory all over the place!

“Halfway through development we were running into a situation which I’m sure a lot of developers will be painfully aware of. We’d push some new feature to the main development branch and weeks later we’d notice some other feature wasn’t working properly, so we’d spend hours hunting some bug that crept in weeks ago. So painful just to think about it!

“So we bought ‘Test-Driven Development for Embedded C’ by Grenning. It took a bit of work to set up, but we’d never go back. We have a couple hundred unit tests installed, everything gets tested after we write a new feature. So we are confident that if a commit is on the main branch, it passes all our unit and integration tests”.
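Empress works in embedded C, using the framework from Grenning’s book; purely to illustrate the regression-testing idea Bragg describes, here is a toy unit-test sketch in Python. The DSP function and its expected values are invented for the example.

```python
# Toy illustration of catching regressions automatically: run these tests
# after every commit to the main branch, so a feature that quietly breaks
# old behaviour fails now, not weeks later. The DSP function is invented.
import unittest

def hard_clip(sample, threshold=0.8):
    """Clamp a sample to +/-threshold."""
    return max(-threshold, min(threshold, sample))

class TestHardClip(unittest.TestCase):
    def test_passes_signal_below_threshold(self):
        self.assertEqual(hard_clip(0.5), 0.5)

    def test_clamps_positive_peaks(self):
        self.assertEqual(hard_clip(1.3), 0.8)

    def test_clamps_negative_peaks(self):
        self.assertEqual(hard_clip(-1.3), -0.8)

if __name__ == "__main__":
    unittest.main()
```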

Even deciding what ZOIA should do out of the box posed Empress with a further conundrum, as Bragg explains: “A less technical but very important challenge we’ve had is what features to work on before release vs what features to leave until after release.

“ZOIA is the kind of product that is never finished. Our beta testing team had a lot of opinions on what features we should implement and what direction we should take the ZOIA in. We implemented a voting forum to get a general idea of what features were important. We then took those features and asked ourselves: Do we like this feature? How complex is it and how long would it take to implement? Is there a workaround if we don’t implement this feature? How useful is it?

“I think working in this way has streamlined our development and led to a product that will be useful for a lot of different people”.

As Bragg says, the germ of ZOIA originated in his use of the Pure Data visual programming language, developed by Miller Puckette in the 1990s for creating interactive computer music and multimedia works. Puckette’s work is also related to Max/MSP, another visual programming language for music, in which modules, routines and libraries can be connected and combined almost infinitely – limited primarily by the processing power of the host CPU.

Bragg spent many hours deep in Pure Data (PD), using a Monome (an interactive instrument that allows the user to define its function; company motto: ‘Sound machines for the exploration of time and space’), precisely to try to create a digital modular system that did away with the need to spend hours twiddling around in Pure Data.

“I love Pure Data, but since I program all day for work, the idea of creating patches in front of a screen doesn’t appeal to me,” Bragg says. “So the idea of ZOIA was to be able to create PD-like patches without a computer. In PD, you have objects; the analogue in the ZOIA is modules. You place down modules on ZOIA’s grid and connect them together in whatever way you want. No need for a computer!”

Bragg pauses for a second, reflecting, before self-correcting. “So I lied when I said there was no need to use a computer with the ZOIA, because there are actually two processors inside the ZOIA. One handles the audio processing and the other is dedicated to the user interface. This is a similar design to our Echosystem and Reverb, but with the ZOIA the user interface takes on a lot more importance.

“The audio processor not only processes audio but receives user actions from the UI processor and sends back the LED and screen data. The UI processor unpacks this data and sends it off to the LED drivers and OLED. It’s a nice decoupled system that is easy to test during development, which was one of our key concerns with creating the ZOIA”.
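The decoupled design Bragg describes can be sketched in miniature. Here the two ‘processors’ are just ordinary Python functions exchanging messages through queues; it is an illustration of the architecture and its testing benefit, not Empress’s firmware.

```python
# Miniature sketch of the decoupled UI/audio split described above: two
# sides that communicate only through message queues, so each can be
# exercised in isolation during testing.
from queue import Queue

ui_to_audio = Queue()   # user actions: button presses, knob turns
audio_to_ui = Queue()   # display state: LED colours, OLED text

def audio_processor_step():
    action = ui_to_audio.get()        # receive a user action...
    # ...apply it to the running patch (omitted), then report display state
    audio_to_ui.put({"led": (0, 255, 0), "oled": f"applied {action}"})

def ui_processor_step():
    state = audio_to_ui.get()         # unpack the display data
    print("LED driver <-", state["led"], "| OLED <-", state["oled"])

ui_to_audio.put("button_3_pressed")   # simulate a user pressing a button
audio_processor_step()
ui_processor_step()
```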

While Empress Effects is known mostly by guitarists for being ‘one of those cool boutique pedal companies’ – lucky as we all are to be living in a golden age of creative zeal, free expression and exceptional quality in the effects pedal sphere – increasingly Empress products have been finding their way into the set-ups of synth players and loopers, and on top of mixing consoles in the studio. The effects are no longer ‘relegated’ to the floor, waiting to be stomped on. Some of Empress’ algorithms are so advanced and so refined that, hell yeah, why wouldn’t you use, say, the Reverb pedal on a send bus of your mixing desk, so that multiple channels in your mix can benefit from a beautiful ‘Plate’ sound?

With this in mind, we wonder what the approach is for new product development at Empress. What are the design goals and aspirations?

“My answer is so boring, but I guess we want people to at least consider the product among the best in whatever it’s doing,” Bragg says. “That’s been the goal for the first 10 years of Empress. Going forward, I think we’ll care less about that and focus more on making interesting products. At least personally, after doing this for 13 years, I don’t want to go back to working on another phaser.

“Of course, having said this, I’ll probably look back in 2050 and wonder why we did that whole line of phaser pedals!” Bragg adds, laughing.

What are the typical design, engineering, and prototyping stages? Is there even a ‘typical’ workflow?

“It seems like it’s different with each product,” Bragg replies. “We are kind of messy when it comes to project management. But I guess there’s some common things. We end up making a lot of prototypes. For the Reverb we made six, I think. I’m not sure if a professional engineer (I never got my official designation) would gasp in horror at that number of revisions, or if they’d say ‘That’s nothin’!’”

“We sometimes make some pretty big changes to the UI, which necessitates a lot of hardware and software changes. I’ve attached a PDF of an early Reverb [pictured below] and also pics of all the Reverb UI board versions.


Empress Reverb circuit

Image credit: Empress Effects

“Sometimes we think we’re ready with a product and take it to NAMM [the annual National Association of Music Merchants trade show, held in the US] and it turns out we’re not. The big example is when we took our Tape Delay to NAMM and people were like, ‘$250 and no presets?’ People were kind of blah about it. So we had to do a pretty big revision, adding presets and some other stuff.”

At Empress, each major new pedal the company has launched has proven to be a springboard to the next major new pedal, as the (small) team builds on what they all learned from designing and engineering the previous product. One of the key pedals in this regard for Empress was 2010’s Superdelay, Empress’ first really complex ‘big box’ pedal, as Bragg recalls.

“Back in 2010 we started playing around with the idea of moving to a really powerful DSP processor. We had used a Microchip dsPIC for our Superdelay product and we were really stretching what it was capable of. In the Superdelay, the processor is overclocked and we had to optimise a bunch of the code in assembly to squeeze out all the processor cycles we could.

“Whatever processor we chose, it had to handle the most intensive audio processing tasks, so the obvious choice was a reverb product. Reverb effects typically require impulse responses for the early reflections and a lot of delay lines to sound natural. We eventually ended up settling on the Analog Devices Blackfin BF514 to run the reverb algorithm.”

Bragg calls on Empress’ algorithm wrangler Jason Fee to explain more about the process for Reverb.

“A challenging aspect of algorithmic reverb design is getting small-room simulations to sound accurate,” Fee says. “Algorithmic reverb typically uses feedback loops with all-pass filters to increase density over time, but with small-room simulations you want a dense response very quickly for it to sound accurate. The first thing to try would be to shorten the loop lengths so that the density increases faster, but this tends to lead to a lot of resonances in the response, which equates to a really metallic tone.

“Instead, we took an interesting approach: we used an actual room impulse response, which we convolve with the incoming signal for the first part of the tail, and then used an algorithmic reverb to provide the latter part of the tail. Getting the transition to sound right took a lot of finesse, but the result is a very realistic small-room sound.”
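In outline, the hybrid Fee describes looks something like the sketch below: convolve the input with a measured impulse response for the early part of the tail, generate the late tail algorithmically, and crossfade between them. All parameters are invented, and the single feedback delay line is a crude stand-in for Empress’s far more sophisticated algorithmic tail.

```python
# Outline of a hybrid small-room reverb: a measured room impulse response
# (IR) supplies the dense early reflections, a simple algorithmic tail
# supplies the rest, and a crossfade hands over between them.
import numpy as np

def hybrid_reverb(x, room_ir, fs=48000, handover_ms=80.0, tail_gain=0.4):
    early = np.convolve(x, room_ir)[: len(x)]    # dense early reflections

    # Crude algorithmic late tail: one feedback delay line (a real design
    # would use networks of all-pass filters, as described above).
    delay = int(0.029 * fs)
    tail = np.zeros(len(x))
    for n in range(len(x)):
        fb = tail[n - delay] if n >= delay else 0.0
        tail[n] = x[n] + tail_gain * fb

    # Linear crossfade: the convolved IR dominates the first part of the
    # tail, the algorithmic reverb takes over after the handover point.
    handover = int(handover_ms / 1000 * fs)
    env = np.zeros(len(x))
    env[:handover] = np.linspace(1.0, 0.0, handover)
    return early * env + tail * (1.0 - env)

# e.g. y = hybrid_reverb(np.random.randn(48000), np.random.randn(2400) * 0.1)
```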

In moving from Superdelay, via Reverb, to Echosystem, always with a third eye on future possibilities, Empress’ approach gradually fell into place: offer users a comprehensive suite of creative options out of the box, plus the ability to constantly update and expand the pedal’s feature set via firmware updates. It pointed the way to ZOIA.

“The Reverb was our introduction to using a complex DSP processor,” Bragg explains. “Since I was handling a lot of the back-end, I spent a ton of time debugging peripheral issues. They always seem to be the most time-consuming issues. With software, you can set breakpoints, see what all your data looks like in a watch window, test hypotheses quickly.

“When your SD card isn’t writing as fast as you need it to, you have to bust out the logic analyzer, and the multimeter, and the scope. Then there are those issues where your code runs fine under an emulator, but booting it from flash fails once every 20 times.

“So you go down tons of rabbit holes, to finally discover some obscure 05-00-0490 anomaly! So my challenges really had nothing to do with the reverb aspect of the product. My challenges were ‘Why does RAM suck? Why does the cache suck? Why is there a stack overflow? Why has the I2C stopped working?'”

All that hard benchwork seems to have paid off, with the internet’s pedal enthusiasts – of whom there are legions – going gaga for the ZOIA. Countless Instagram musician accounts now regularly post short video clips of whatever fresh’n’freaky insanity they’ve created with a ZOIA, a Reverb and a MIDI keyboard. Few pedals have triggered quite the revolution in people’s heads that ZOIA has.

Crucially, for Empress and the rest of us, as much as for himself, Bragg is as enthused about effects pedals as ever. Reflecting on the huge variety of boutique pedal companies around the world, often just one- or two-person operations, fellow aural astronauts exploring strange new frontiers in space and sound, Bragg is pretty happy with the state of affairs: “It does seem like a strange, wonderful industry. It’s nice that the market pie is so fragmented. Four companies own 50 per cent of the [guitar] amp market. By comparison, the top 10 best-selling pedal companies don’t even own 50 per cent of the pedal market. That’s nice!

“There are tons of pedal companies that come out of nowhere with a new idea that shakes things up and they sell enough to make a living. Making guitar pedals?!? Freaking mental!!”

Empress Effects ZOIA product page

View from Brussels: Energy disunion

August 19th, 2019

Nearly one million homes and businesses were left without power on 9 August after a gas-fired power plant and wind farm both went offline, triggering the UK grid’s fail-safe measures.

Chaos subsequently hit the railways and some airports, as signals and terminal power went down for about two hours.

An in-depth investigation into what truly went wrong is still under way and is expected to last several weeks, with preliminary findings due out this week.

While National Grid called the event “highly unusual” and “without precedent”, in what has been dubbed the worst blackout in a decade, its impact actually goes beyond the UK’s shores.

As part of the European Union’s efforts to guarantee energy supply, reduce reliance on geopolitically toxic imports and decarbonise the bloc’s energy system, an overarching strategy known as the Energy Union was launched in 2015.

That involved increasing the scope of collaboration between countries on power management, reforming energy market rules and rethinking how the EU wants to keep the lights on in the years to come.

Four years later, the outgoing Commission concluded in April that the Union had largely been completed, thanks to a raft of new energy laws that will govern renewables and efficiency measures, as well as new pipelines and cables.

Interconnectors are a big part of the energy plan, as they are supposed to allow clean power to make its way across the EU to where it is needed and help countries make their energy supply more robust.

In an ideal scenario, surplus cheap and clean wind power generated in Portugal would be able to be utilised in Finland, for example.

But the UK blackout has revealed that National Grid has been restricting the use of three of the cross-Channel cables that it co-owns, reportedly to limit the risk of power cuts if any of the interconnectors were to trip.

The company has 4 gigawatts of continental power on its books, which, according to a National Grid source, is regularly limited to under 75 per cent of each cable’s capacity when there is an increased risk of failure. UK power needs top the 32GW mark.
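Those numbers are worth a quick back-of-envelope check; the 4GW, 75 per cent and 32GW figures are from the reporting above, and the rest follows directly.

```python
# Back-of-envelope arithmetic on the figures above.
interconnector_gw = 4.0    # continental capacity on National Grid's books
usage_cap = 0.75           # limit applied when failure risk is elevated
peak_demand_gw = 32.0      # UK power needs top this mark

usable = interconnector_gw * usage_cap     # 3.0 GW actually available
withheld = interconnector_gw - usable      # 1.0 GW paid for but held back
print(f"withheld: {withheld:.1f} GW, "
      f"{withheld / peak_demand_gw:.1%} of peak demand")
# withheld: 1.0 GW, 3.1% of peak demand
```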

Firms are actually paid not to use the continental electricity supplied by the interconnectors and the costs are then passed on to bill payers.

Blackouts can be caused by a lack of inertia in the grid, which is normally provided by the facilities generating electricity. This is a particular issue for renewable energy sources like solar and wind, as they lack the large spinning turbines of traditional power plants.

Inertia affects the grid’s frequency, which is supposed to stay as close as possible to a standard 50Hz. A significant deviation from 50Hz will activate fail-safes.

In early 2018, digital clocks linked to the grid’s frequency were showing an incorrect time after a power dispute between Serbia and Kosovo meant the reading dropped to 49.996Hz, illustrating how delicate the electricity network can be.
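The clock effect is simple arithmetic: mains-synchronised clocks count cycles of the supply, so a sustained deviation below 50Hz accumulates into lost time. A quick sketch using the 49.996Hz reading above (the 50-day duration is our assumption for illustration):

```python
# Mains-synchronised clocks count cycles of the supply, so a sustained
# deviation below 50Hz accumulates into lost time. The frequency is the
# reading quoted above; the 50-day duration is assumed for illustration.
NOMINAL_HZ = 50.0
actual_hz = 49.996

seconds_per_day = 24 * 3600
lost_per_day = seconds_per_day * (NOMINAL_HZ - actual_hz) / NOMINAL_HZ
print(f"{lost_per_day:.1f} s lost per day")              # ~6.9 s/day
print(f"{lost_per_day * 50 / 60:.1f} min over 50 days")  # ~5.8 min of drift
```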

Relying too heavily on imported electricity, without a modern grid to handle the worst-case scenario, is a risk that the UK’s electrical network is at least aware of.

The UK’s odd relationship with its partners across the Channel, painfully mirrored by its protracted attempts to sever ties with the EU, looks set to continue in the energy sector too, as more interconnectors are on the way.

Undersea cables are in the works that will increase capacity by more than 3GW, linking the UK grid to Denmark and Norway and increasing flows to and from France.

If the policy of limiting how much interconnector power domestic consumers can actually use continues, rather than the grid being updated to better suit a world of green power and flexible energy needs, a truly interconnected European grid will remain a pipe dream.

View from India: Telecoms sector sharing infrastructure to improve services

August 19th, 2019

The Indian telecommunications sector has undergone a remarkable transition in the last two decades to become, in terms of number of subscribers, the world’s second-largest telecommunications market. India is also one of the fastest-growing telecommunications markets.

The last two years have been a period of consolidation for India’s telecoms market. Effectively, there are now only three private entities and two public sector undertakings (PSUs) that provide access services.

These are vertically integrated service providers. Their telecommunications services include wireline and wireless access, internet, national long distance (NLD), international long distance (ILD) and enterprise business services. Against this backdrop, a consultation paper from the Telecom Regulatory Authority of India (TRAI) has reviewed business opportunities for Infrastructure Providers Category-I (IP-I), which provide assets such as dark fibre, rights of way, duct space and towers on a lease, rent or sale basis to licensees of telecom services.

What is noteworthy is that IP-I has played a significant role in making affordable telecom services available. The deployment of shared tower infrastructure by IP-I has led to the rapid growth of mobile networks. Over the years, the telecom tower industry has emerged as a trendsetter in the infrastructure segment. The telecom tower companies are registered under IP category-I with the department of telecommunications (DoT). Some of the telecom service providers (TSPs) have also hived off their tower assets into separate entities; these hived-off entities have obtained IP category-I registration.

In order to widen the horizons of the telecommunications sector in India, the consultation paper recommends further sharing of active and passive infrastructure, which keeps investment low and brings favourable economies of scale. Passive infrastructure sharing allows operators to share the non-electrical, civil-engineering elements of telecommunication networks. This might include rights of way or easements, ducts, pylons, masts, trenches, towers, poles, equipment rooms and related power supplies, air conditioning and security systems. Active infrastructure sharing involves sharing the active electronic network elements – the intelligence in the network – embodied in base stations and other equipment for mobile networks, and access node switches and management systems for fibre networks.
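The split can be summarised as a simple taxonomy; the examples below are those listed above, while the lookup structure itself is just our illustration.

```python
# The passive/active split above as a simple lookup; examples are taken
# from the consultation paper's lists, the structure is ours.
SHARING_CATEGORIES = {
    "passive": {            # non-electrical, civil-engineering elements
        "rights of way", "ducts", "pylons", "masts", "trenches",
        "towers", "poles", "equipment rooms", "power supplies",
    },
    "active": {             # electronic elements: the network intelligence
        "base stations", "access node switches", "management systems",
    },
}

def category_of(element):
    for category, elements in SHARING_CATEGORIES.items():
        if element in elements:
            return category
    return "unclassified"

print(category_of("towers"))          # passive
print(category_of("base stations"))   # active
```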

Infrastructure sharing can broaden the availability of spectrum and boost data consumption. Although India’s total data consumption is presently among the highest in the world, per-user data consumption is much lower than in many countries in East Asia, Europe and America. As per the Digital Economy and Society Index (DESI) Connectivity Report 2019, published by the European Commission, internet traffic per capita in Western Europe is currently 44Gb per month, and mobile data currently represents only 6 per cent of European internet traffic. In contrast, in India, the share of wireline broadband access in total data consumption is negligible.

TRAI has also presented draft amendment regulations for telecommunication mobile number portability (MNP), which lets users switch telecom carriers without changing their mobile number. The regulations also cover the per-port transaction charge, which refers to the charge payable by the recipient operator to the mobile number portability service provider for processing each porting request of a mobile subscriber number. The focus is also on the dipping charge: the charge payable by a service provider that uses the query response system of the MNP service provider to obtain the location routing number (LRN) for correct routing of the number dialled.
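Mechanically, a ‘dip’ is just a database lookup performed before routing a call: given a dialled number, the query response system returns the LRN of the network that now hosts it. A schematic sketch, with invented numbers and LRN labels:

```python
# Schematic sketch of an LRN 'dip': before routing a call, a service
# provider queries the MNP database for the location routing number of the
# network that currently hosts the dialled number. Each query incurs the
# dipping charge. All numbers and LRNs here are invented.
PORTED_NUMBERS = {
    "+919810000001": "LRN-DEL-OP2",   # ported from operator 1 to operator 2
    "+919810000002": "LRN-MUM-OP3",
}

def dip(dialled_number, home_lrn="LRN-HOME"):
    """Return the LRN to route the call on."""
    return PORTED_NUMBERS.get(dialled_number, home_lrn)

print(dip("+919810000001"))   # LRN-DEL-OP2: route to the recipient operator
print(dip("+919810009999"))   # LRN-HOME: number was never ported
```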

As per the draft amendment, the per-port transaction charge is proposed to be Rs 5.74. TRAI, in its earlier recommendation, had suggested a lower fee of Rs 4. The new rate of Rs 5.74 is scheduled to apply from 30 September 2019.

The draft states that a subscriber can retain his or her existing mobile telephone number when switching from one service provider to another, or from one technology to another of the same service provider, using the Mobile Number Portability service within the same Licensed Service Area (LSA) as well as pan-India in any LSA. Mobile Number Portability has been operational in India since 2009, when MNP service licences were issued to two Mobile Number Portability Service Providers (MNPSPs) by DoT.

Most stakeholders feel that there should be a consolidated per-port transaction charge (PPTC) subsuming the charges for ancillary services as well. It is also felt that number return, subscriber reconnection and non-payment disconnection should be charged separately, whereas the port cancellation charge should be included in the PPTC. Further, database downloads should be provided free of charge.

In the case of dipping charges, stakeholders are of the opinion that these should continue to be under forbearance, and any service provider that requires dipping services from the MNPSPs can avail them on mutually agreed terms.

It is hoped that both draft regulations will positively impact the telecoms sector, which has a significant role to play in fulfilling many national programmes. The sector is expected to be a key contributor to various government programmes such as Digital India, Make in India and the development of Smart Cities. These programmes present a host of opportunities for the telecoms sector, especially for the infrastructure providers, as telecommunications infrastructure is the bedrock for achieving the vision of Digital India. Overall, the changes are expected to improve revenues in the telecoms sector.

Fracking claimed to be less contaminating to groundwater than conventional oil and gas

August 16th, 2019

High-volume hydraulic fracturing, commonly known as fracking, injects water, sand and chemicals under high pressure into petroleum-bearing rock formations to recover previously inaccessible oil and natural gas. This method led to the current shale gas boom that started around 15 years ago.

Conventional methods of oil and natural gas production, which have been in use since the late 1800s, also inject water underground to aid in the recovery of oil and natural gas.

“If we want to look at the environmental impacts of oil and gas production, we should look at the impacts of all oil and gas production activities, not just hydraulic fracturing,” said Jennifer McIntosh, a University of Arizona professor of hydrology and atmospheric sciences.

“The amount of water injected and produced for conventional oil and gas production exceeds that associated with fracking and unconventional oil and gas production by well over a factor of 10.”

The research team looked at how much water is injected underground by petroleum industry activities, how those activities change pressures and water movement, and how those practices could contaminate groundwater supplies.

While groundwater use varies by region, about 30 per cent of Canadians and more than 45 per cent of Americans depend on the resource for their municipal, domestic and agricultural needs.

“There’s a critical need for long-term – years to decades – monitoring for potential contamination of drinking water resources, not only from fracking, but also from conventional oil and gas production,” McIntosh said.

“What was surprising was the amount of water that’s being produced and re-injected by conventional oil and gas production compared to hydraulic fracturing.

“In most of the locations we looked at – California was the exception – there is more water now in the subsurface than before. There’s a net gain of saline water.”

Oil and gas production activities can have environmental effects far from petroleum-producing regions.

For example, previous studies show that operating disposal wells can cause detectable seismic activity more than 90km away. Conventional activities inject lower volumes of water and at lower pressure, but take place over longer periods of time, which may cause contamination over greater distances.

There are also thousands of active, dormant and abandoned wells across North America. Some are leaky or were improperly decommissioned, providing possible pathways for contamination of freshwater aquifers.

There is little consensus as to the scale of the problem, and decommissioning could cost anywhere from a few billion to a few hundred billion dollars, depending on its extent.

“We haven’t done enough site investigations and monitoring of groundwater to know what the liability really looks like,” said Grant Ferguson, University of Saskatchewan, who also worked on the project. “My guess is that some wells probably should be left as is and others are going to need more work to address migration of brines and hydrocarbons from leaks that are decades old.”

A separate study released this week by researchers from Cornell University warned that the boom in fracking for shale has ‘dramatically increased’ global emissions of methane in the past decade.

Model developed to cut carbon emissions from buildings by 80 per cent

August 16th, 2019

Energy use in buildings, such as for heating, cooling and lighting, is responsible for over one-third of all CO2 emissions in the USA.

The new model will require the installation of highly energy-efficient building technologies, new operational approaches, and electrification of building systems that consume fossil fuels directly.

“Buildings are a substantial lever to pull in trying to reduce total national CO2 emissions since they are responsible for 36 per cent of all energy-related emissions in the US,” said Jared Langevin, a research scientist at Lawrence Berkeley National Laboratory and lead author of the study.

“Because the buildings sector uses energy in a multitude of ways and is responsible for such a large share of electricity demand, buildings can help accelerate the cost-effective integration of clean electricity sources on top of contributing direct emissions reductions through reduced energy use.”

The researchers considered three types of efficiency measures: technologies with higher energy performance than typical alternatives, such as dynamic windows and air sealing of walls; sensing and control strategies that improve the efficiency of building operations; and conversion of fuel-fired heating and water-heating equipment to comparable systems that can run on electricity.

They also considered how parallel incorporation of renewable energy sources into the electric grid would shift emissions reduction estimates from each building efficiency measure and the buildings sector as a whole.

“While building CO2 emissions are quite sensitive to the greenhouse gas intensity of the electricity supply, measures that improve the efficiency of energy demand from buildings need to be part of the solution,” Langevin said.

“Getting close to the 80 per cent emissions reduction target requires concurrent reductions in building energy demand, electrification of this demand, and substantial penetration of renewable sources of electricity – nearly half of annual electricity generation by 2050.

“Moreover, buildings can support the cost-effective integration of variable renewable sources by offering flexibility in their operational patterns in response to electric grid needs.”

The team proposed the installation of energy-saving retrofits and upgrades to walls, windows, roofs and insulation. The introduction of smart software could also be used to optimise when, where, and to what degree energy-intensive building heating, cooling, lighting and ventilation services should be provided.

The researchers stressed that policymakers will need to take action for these measures to be broadly rolled out.

“Regulations and incentives that support the sale of more efficient, less carbon-intensive technology options, early-stage research and development that drives breakthroughs in technology performance, aggressive marketing of those technologies once developed, training for local contractors charged with technology installation, and consumer willingness to consider purchasing newer options on the market are all needed to achieve the 80 per cent emissions reduction goal by 2050,” Langevin said.

“We look forward to periodically revisiting this analysis to reassess where emissions from the buildings sector stand relative to the 2050 target, under both business-as-usual and more optimistic scenarios of efficient technology adoption and renewable electricity supply.”

Microsoft gets caught listening to Skype user recordings

August 15th, 2019

The company updated its privacy policy and other online pages to reflect the fact that human ears may be privy to voice recordings captured through Skype or Cortana, following an investigation into the practice by Motherboard.

“We’ve updated our privacy statement and product FAQs to add greater clarity and will continue to examine further steps we might be able to take,” a Microsoft spokesperson said in reply to Motherboard.

“We realised, based on questions raised recently, that we could do a better job specifying that humans sometimes review this content,” they added.

The firm says it collects voice data to improve some products, such as checking translations.

Facebook, Apple and Google have all been caught doing the same thing in recent weeks, but unlike Microsoft all three have pledged to stop listening in, at least for the time being. 

Amazon chose a different approach for recordings captured using its Alexa virtual assistant by adding the option to opt out of human reviewers altogether.

The snippets of Skype audio obtained by Motherboard were mostly between five and 10 seconds long, although its source stated that longer clips are sometimes analysed.

“We take steps to de-identify the content provided to vendors, require non-disclosure agreements with all vendors and their employees to protect our customers’ privacy and require that handling of this data be held to the highest privacy standards set out in European law,” the spokesperson said.

“At the same time, we’re always looking to improve transparency and help customers make more informed choices.”

Hands-on review: Ekster voice-activated smart wallet

August 13th, 2019

I’ll be honest. When the email announcing the arrival of a voice-activated smart wallet pitched up in the E&T inbox, my assumption was that this would be an unnecessarily complicated bit of technology for technology’s sake. A case for your contactless payment cards that only allows them to be activated if you’ve got sufficient funds, maybe. Or that gives you the reassurance of having to speak a passphrase as well as tapping it on a card reader for an additional layer of security.

In fact, the Ekster range of wallets is much more practical than that: a genuine solution to the risks of how we pay for things these days that also looks good. And after raising over $1,000,000 in previous crowd-funding campaigns, the company, which claims to be the world’s largest smart wallet brand, has launched a third-generation range of its products.

What you get with the smart wallet is a compact, classy-looking holder for your cards and a small amount of cash that comes with some neat security-related features. The principal one is the protection it provides against the danger of someone equipped with a dodgy RFID reader skimming your card details without your knowledge. There’s nothing new about that when contactless payments are so mainstream these days that you can pick up an aluminium case in your local corner shop for a couple of pounds, but Ekster takes it to another level.

Both versions of the wallet – the Parliament and the Senate – incorporate an ingenious mechanism that holds cards securely but pops them out at the press of a button. Hold the cardholder upright and you can drop in a stack of four, five or six cards (depending on whether they’re embossed or not). Push them in and they’re held firmly in place until you hit the eject button, when they emerge stacked, ready to select the one you want like a card sharp’s deck. The friction mechanism is smooth and there was no danger of them flying out across the floor.

We tested the Parliament bi-fold wallet, using some old supermarket loyalty cards for a dry run. As you’d expect, getting them in and out took a little practice, particularly plucking one card from a stack of several by carefully holding each side. With nothing covering the slot they come out of, it’s a leap of faith to rely on the mechanism to hold them in, but it resisted being held upside down and firmly shaken. And the eject button is one of those pleasing physical interfaces that you’d probably find yourself playing with even when you didn’t need to take a card out.

It’s compact, though: with a surface area barely larger than a standard credit card and a thickness of about 1cm, it slips easily into a pocket. Two pockets can hold a few banknotes, other cards or a couple of business cards. Ekster suggests you could fit some coins in there as well, although that would spoil the look of something that’s been designed with slimline elegance in mind.

So your cards are protected from skimmers. What’s smart about that?

Picture the scene. You’re enjoying the leisurely stroll back to your hotel or Airbnb apartment after an evening meal when you realise that although you can feel the reassuring heft of the mobile phone in your back pocket, your cardholder isn’t where it should be. You remember popping out a card to pay for dinner and slotting it back into the holder, but didn’t you then get distracted talking about what tip to leave and lay it down on the table?


Tracking the Ekster voice-activated smart wallet

Image credit: Ekster

Ekster won’t spare you the panicked rush back to the bar or restaurant, but it’ll give you a level of reassurance. The solar-powered tracking card that fits neatly inside the wallet uses the same technology implemented by global shipping companies to track containers. As long as you’ve remembered to download the Chipolo app to your phone, create an account (login with Google or Facebook is possible, but you might prefer to just use an email address and separate password) and pair with the tracker, you can locate it – and your wallet – anywhere in the world. If you think it’s nearby, tap ‘ring’ and it’ll pipe up. If not, you can use ‘mark as lost’ to activate a crowd network locator to help with your search.

Three hours in daylight will give you enough power for two months’ worth of tracking functionality, although Ekster recommends leaving the card outside the wallet for a couple of hours every month to ensure battery longevity. It’s designed to fit neatly in the Ekster wallet, but obviously you could use it to locate anything you’re worried about mislaying.

And what makes it a voice-activated wallet? Compatibility with Google Assistant and Amazon Alexa. You can locate the card, and whatever valuable you’ve attached it to, from your phone using either assistant: it can make your wallet ring, or tell you where it was last seen by the Chipolo app.

With prices starting at $69 for a cardholder or $79-89 for a wallet, and the tracking card an additional $49, this isn’t a cheap solution to preventing someone grabbing your card details, but it looks great, is built to last, and you probably only need to use the location function once to justify paying for it.

For full details and to order see ekster.com.