What’s cooking @ AWS NYC summit

Here’s a lineup of offerings (plus enhancements)



Courtesy: Jeff Barr

What really caught (or should catch) my attention ~

“Amazon Macie” -> https://aws.amazon.com/blogs/aws/launch-amazon-macie-securing-your-s3-buckets/

From a 10,000 ft view, Macie enables you with ‘deep insights into your data’, but we are not talking about BI here. It’s all about continuous monitoring of your data (a laundry list of actions: sorting, tagging, grouping, pattern matching, anomaly detection, integrity tracking). Sounds intriguing!

And forget not: NLP and ML duly take their seats in this very offering.

Quick def:

Macie can automatically discover and classify your data stored in Amazon S3. But Macie doesn’t stop there: once your data has been classified, it assigns each data item a business value, and then continuously monitors the data in order to detect any suspicious activity based on access patterns.
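To make the classify-then-monitor idea concrete, here is a toy sketch in Python. This is not Macie’s API: the patterns, the business-value weights, and the anomaly threshold are all illustrative assumptions, standing in for the kind of pattern matching, value scoring, and access-pattern monitoring described above.

```python
import re

# Toy patterns standing in for Macie-style content classifiers (illustrative only).
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

# Hypothetical business-value weights per detected data type.
VALUE = {"email": 5, "ssn": 10}

def classify(text):
    """Return the data types found in `text` and a crude business-value score."""
    found = [name for name, rx in PATTERNS.items() if rx.search(text)]
    return found, sum(VALUE[name] for name in found)

def is_suspicious(access_counts, threshold=3.0):
    """Flag the latest access count if it exceeds `threshold` times the prior mean."""
    *history, latest = access_counts
    baseline = sum(history) / len(history)
    return latest > threshold * baseline
```

For example, `classify("reach me at jane@example.com")` tags the object as containing an email address, and `is_suspicious([10, 12, 11, 95])` flags a sudden spike in accesses — the same two-phase shape (classify first, then watch access patterns) that the quick def describes.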



FOG’gy to say CLOUD to be EDGE’d

A technology wave is nothing but a cycle. We have been through constant technology cycles in which we ‘time capsule’ into a future derived from the past.

Mainframes > Open Systems > Virtualization (Hypervisor) > Bare Metal (containers)

When we started computing in the old days, we followed a model of ‘start small’, ‘keep everything local’, and very ‘decentralized’. During the pre/post-.COM era we began to trust the CoLo providers and opened up to the centralization of computing; data kept being built and managed at CoLos. Then the Cloud was born, an authentic ‘centralized’ way of computing.

All the ramblings in tech circles about how the ‘EDGE’ would eat the Cloud set me up with a curious platform to write this piece.

What in the world is the EDGE?

Edge computing is a method of accelerating and improving the performance of cloud computing for mobile/end users. The computational and data-processing stack runs on the users’ own end devices, and the Cloud is consumed purely for failsafe/recovery/long-term-storage needs.
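The split described here — compute locally, fall back to the cloud — can be sketched as a tiny dispatcher. Everything below is a toy assumption (the byte threshold, the handler names), just to show the routing decision in code:

```python
# A toy edge-first dispatcher: process on the device when the payload fits a
# local budget, and fall back to the cloud backend otherwise. The threshold
# and handlers are illustrative assumptions, not a real edge framework.

EDGE_LIMIT_BYTES = 1024  # assumed local processing budget

def handle_on_edge(payload: bytes) -> str:
    return f"edge:{len(payload)}"

def handle_in_cloud(payload: bytes) -> str:
    return f"cloud:{len(payload)}"

def dispatch(payload: bytes) -> str:
    """Route a request to the edge device or to the cloud backend."""
    if len(payload) <= EDGE_LIMIT_BYTES:
        return handle_on_edge(payload)
    return handle_in_cloud(payload)
```

A small request stays on the device (`dispatch(b"x" * 10)` routes to the edge handler), while a large one falls back to the cloud — the cloud keeps its role for the heavy, durable work, exactly as the definition above suggests.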

This one resonates with an anti-Cloud vibe, doesn’t it? Meaning a Decentralization Era again?

Big YES! We would get decentralized yet again. Remember the cycles we touched on earlier on this page!

So why do people think edge computing will blow away the cloud? This claim is made in many online articles. Clint Boulton, for example, writes about it in his Asia Cloud Forum article, ‘Edge Computing Will Blow Away The Cloud’, in March this year. He cites venture capitalist Andrew Levine, a general partner at Andreessen Horowitz, who believes that more computational and data processing resources will move towards “edge devices” – such as driverless cars and drones – which make up at least part of the Internet of Things. Levine prophesies that this will mean the end of the cloud as data processing will move back towards the edge of the network.

In other words, the trend has been up to now to centralise computing within the data centre, while in the past it was often decentralised or localised nearer to the point of use. Levine sees driverless cars as being a data centre; they have more than 200 CPUs working to enable them to operate without going off the road and causing an accident. The nature of autonomous vehicles means that their computing capabilities must be self-contained, and to ensure safety they minimise any reliance they might otherwise have on the cloud. Yet they don’t dispense with it.

The two approaches may in fact end up complementing each other. Part of the argument for bringing data computation back to the edge comes down to increasing data volumes, which lead to ever more frustratingly slow networks. Latency is the culprit. Data is becoming ever larger, so there is going to be more data per transaction, more video and sensor data. Virtual and augmented reality are going to play an increasing part in its growth too. With this growth, latency will become more challenging than it was previously. Furthermore, while it might make sense to put data close to a device such as an autonomous vehicle to eliminate latency, a remote way of storing data via the cloud remains critical.
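A back-of-the-envelope calculation shows why latency pushes computation to the edge. All the figures below are assumptions (frame size, link speed, round-trip times), but the arithmetic is the point:

```python
# Back-of-the-envelope latency comparison (all figures are assumptions):
# shipping a 5 MB sensor frame to a cloud region over a 100 Mbit/s link with
# a 40 ms round trip, versus handling it on a local edge node with ~1 GB/s
# effective throughput and a 1 ms round trip.

FRAME_BYTES = 5 * 1024 * 1024
CLOUD_BPS = 100e6 / 8       # 100 Mbit/s expressed in bytes per second
CLOUD_RTT_S = 0.040
EDGE_BPS = 1e9              # assumed local throughput
EDGE_RTT_S = 0.001

def round_trip_seconds(payload_bytes, bytes_per_second, rtt_seconds):
    """Transfer time plus network round trip for one frame."""
    return payload_bytes / bytes_per_second + rtt_seconds

cloud = round_trip_seconds(FRAME_BYTES, CLOUD_BPS, CLOUD_RTT_S)  # ~0.46 s
edge = round_trip_seconds(FRAME_BYTES, EDGE_BPS, EDGE_RTT_S)     # ~0.006 s
```

Under these assumed numbers the cloud round trip is tens of times slower per frame — fine for archival and recovery, but not for a vehicle that needs an answer before the next road marking.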

For the last several years, enterprises have focused on cloud computing, and have been developing strategies to “move to the cloud” or at least “expand into the cloud.” It’s been a one-way, straight highway. There’s a sharp left turn coming ahead, where we need to expand our thinking beyond centralization and cloud, and toward location and distributed processing for low-latency and real-time processing. Customer experience won’t simply be defined by a web site experience. The cloud will have its role, but the edge is coming, and it’s going to be big.

I’m reminded of an unattributed quote that seems to apply every time a new idea pops up in the world of technology:

“Look back to where you have been, for a clue to where you are going.”

Google Cloud Next’17 – Recap

A recap of the Google Cloud Next’17 conference, with a few snippets!

  • Google is trying to appeal to Enterprises (including legacy ones!)
  • Google spoke the language Enterprises love to hear – multi-cloud, and partnering with ERP giants like SAP
    • Showcasing a lineup of Enterprise customers and case studies (HSBC, Colgate, Schlumberger, Disney, The Home Depot, et al.)
  • Customers respect that Google has a truly global presence; its inter-region connectivity with well-enhanced security is a standout factor

The big draw for Enterprises and shadow customers alike: Google has the muscle to push and deliver Machine Learning/AI and Big Data capabilities via the Cloud.

Leveraging AI/ML in Microservices Platform

AI/ML could well be leveraged in a microservices platform where the building-block components/services cause a ‘race condition’ for resources, or to predict and satisfy resource-management needs.
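As a minimal sketch of the “predict and satisfy” idea: forecast the next interval’s demand with a simple moving average and scale replicas ahead of contention. The window size and per-replica capacity are illustrative assumptions, and a real system would use a proper ML model rather than a moving average:

```python
import math

# A minimal sketch of predictive resource management for microservices:
# forecast the next interval's demand and provision replicas ahead of time,
# so competing services don't race for the same resources.

def forecast(demand_history, window=3):
    """Moving-average forecast of the next interval's demand."""
    recent = demand_history[-window:]
    return sum(recent) / len(recent)

def replicas_needed(demand_history, capacity_per_replica=100):
    """Replicas to provision for the forecast demand (assumed capacity units)."""
    return max(1, math.ceil(forecast(demand_history) / capacity_per_replica))
```

With a demand history of `[100, 200, 300]` requests per interval, the forecast is 200 and two replicas get provisioned before the spike lands, rather than after services have already started contending.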

The ability to deploy machine-learning applications as containers and to cluster those containers has several advantages, including:

  • The ability to make machine learning applications self-contained. They can be mixed and matched on any number of platforms, with virtually no porting or testing required. Because they exist in containers, they can operate in a highly distributed environment, and you can place those containers close to the data the applications are analyzing.
  • The ability to expose the services of machine learning systems that exist inside of containers as services or microservices. This allows external applications, container-based or not, to leverage those services at any time, without having to move the code inside the application.
  • The ability to cluster and schedule container processing to allow the machine learning application that exists in containers to scale. You can place those applications on cloud-based systems that are more efficient, but it’s best to use container management systems, such as Google’s Kubernetes or Docker’s Swarm.
  • The ability to access data using well-defined interfaces that deal with complex data using simplified abstraction layers. Containers have mechanisms built in for external and distributed data access, so you can leverage common data-oriented interfaces that support many data models.
  • The ability to create machine learning systems made up of containers functioning as loosely coupled subsystems. This is an easier approach to creating an effective application architecture where you can do things such as put volatility into its own domain by using containers.
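The second bullet — exposing a containerized model as a service — can be sketched with nothing but the Python standard library. The “model” here is a stand-in (a fixed weighted sum), and the port and endpoint shape are assumptions; the point is the self-contained process you would package as a container’s entrypoint:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# A minimal sketch: a trivial "model" exposed as a microservice endpoint, the
# kind of self-contained process you would package into a container image.
# The model is a stand-in (a fixed linear scorer), not a real ML system.

def predict(features):
    """Stand-in model: weighted sum of the input features (assumed weights)."""
    weights = [0.5, 1.5, -2.0]
    return sum(w * x for w, x in zip(weights, features))

class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        score = predict(json.loads(body)["features"])
        payload = json.dumps({"score": score}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload)

# To run the service (e.g. as a container's entrypoint):
#   HTTPServer(("0.0.0.0", 8080), PredictHandler).serve_forever()
```

Because the model lives behind an HTTP interface, external applications — container-based or not — can call it without the code moving inside them, which is exactly the loose coupling the list above describes.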

AWS volumes go Elastic!

AWS has introduced one of its most customer-demanded features, “Elastic Volumes”, available for all current-generation EBS volumes attached to current-generation EC2 instances.

This new capability allows you to modify the configuration of live EBS volumes with a few console clicks. You can now dynamically increase the volume size, tune performance (change IOPS), or change the volume type of any new or existing current-generation volume with no downtime or performance impact.

You can continue to use your application while the change takes effect. The Elastic Volumes feature is available in all AWS regions at no additional cost.
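The same change can be driven from code. `modify_volume` is a real EC2 API in boto3, but the volume ID and sizes below are placeholders, and the client is injected as a parameter so the logic can be exercised without AWS credentials; treat this as a hedged sketch rather than production tooling:

```python
# A sketch of growing a live EBS volume via the EC2 API. The EC2 client is
# passed in (with real credentials you would pass boto3.client("ec2")); the
# volume ID and target size below are placeholders.

def grow_volume(ec2, volume_id, new_size_gib):
    """Request an in-place size increase for an attached EBS volume."""
    resp = ec2.modify_volume(VolumeId=volume_id, Size=new_size_gib)
    # The modification proceeds while the volume stays attached and in use.
    return resp["VolumeModification"]["ModificationState"]

# Example call (placeholder volume ID):
#   grow_volume(boto3.client("ec2"), "vol-0123456789abcdef0", 200)
```

After the request, `describe-volumes-modifications` (console or API) shows the modification progressing from `modifying` through `optimizing` to `completed`; you would still grow the filesystem inside the instance afterwards.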