The Good, the Bad and the Reality of Apple’s New Payments Processing
Apple Pay launched in the U.K. this past week with ads blazing for the new service, which the company hopes will reduce customers’ dependence on physical credit cards. The contactless service lets customers at select retailers simply hold an iPhone (or, presumably soon, an Apple Watch) over a card reader for purchases under £20. A range of high street retailers have already signed on, and the service is expected to have a very strong U.K. launch, following its U.S. debut in October of last year.
However, Apple Pay has gained little traction in the U.S. — so why would it fare better in the U.K.? The answer is actually quite simple: the U.S. payments-processing market is far more fragmented, which slows the adoption of newer payment technologies. Chip-and-PIN processing, for example, has still not been adopted as a standard in the U.S.
But chip-and-PIN processing is precisely what opened the door to Apple Pay throughout the European Union: the underlying terminal technology is flexible enough to let newcomers like Apple Pay enter the payments-processing space and claim their own share of fees alongside the older processors. In the meantime, U.S. retailers are hoping to have an affordable technology standard in place that accepts Apple Pay by the end of the year.
Verdict: Apple Pay should do quite well in the U.K. and throughout Europe. Naysayers pointing to the lack of American adoption underestimate just how difficult it is to upgrade device and processing standards in the U.S. — a problem that barely exists in Europe.