Tuesday, January 16, 2024

The shift from a datacenter-centric to an application-centric approach

Traditional IT, meaning the on-prem, datacenter-oriented approach we've had for the last decades, had a pretty clear distinction between hardware and software. You had the network and the servers, and then software on top of the servers. Networks and servers were hardware, and on the servers you had software installed onto their operating systems.

Then came server virtualization. The server hardware was abstracted from the operating system, which allowed you to run multiple virtual servers on one physical machine. The prediction back then was that hardware sales would almost come to a halt. As we now know, that didn't quite come true.

Picture: Dimi Doukas using Dezgo.com AI picture generator
 

The next step: since the servers ran on the same hardware, the traffic between these virtual servers didn't necessarily ever leave the host to reach the actual network. So the virtual servers needed a proper way to communicate with each other, plus the ability to segment and restrict that communication between different IP addresses, and eventually to reach the physical network outside the host hardware. So we got virtualized networks.

At this point we were still pretty much within our own datacenter. But this already gave some managed service providers (MSPs) an opportunity to start offering platform as a service (PaaS): 'renting' out a platform for customers to install and run their applications on, with the MSP managing the layers underneath, like servers, network, storage and middleware, serving the application's needs under the hood. The other way was infrastructure as a service (IaaS), where the customer managed everything themselves, only not owning it but renting it.

The next phase was taking these IaaS and PaaS environments, moving them outside the customer's datacenter and automating them in such a way that it was possible to offer self-service environments, where scaling and implementing new services was fast and didn't need any intervention from IT staff to get the services up and running. This was called cloud. It was still a datacenter, but now owned by a large service provider who just took everything several steps further. Automation and the ability to scale were the distinguishing factors.


Picture: Dimi Doukas using Dezgo.com AI picture generator


Servers and networking were not added one by one into the datacenter; instead, whole pre-built blocks of datacenter were installed or swapped out when needed, brought in by trucks. And the whole block had to be managed and automated by operating systems and management software that didn't exist yet and needed to be built in-house. Since none of the products on the market at the time were built for such use and scale, Google, as one of the pioneers of the cloud industry, needed to write its own code for managing and scaling everything - all the way from IP and traffic management and load balancing to automation, security and so on. Of course not everything was built from scratch, but a lot of the needed features and capabilities were missing, so these environments were built as unique solutions, never seen elsewhere in the world. Yet in the end they were not built as snowflakes, but as highly standardized and automated platforms by the cloud provider.

So now we had our platform ready, but it was still built for monolithic applications: the ones that ran as one big block of code, which was updated from time to time with a new, even bigger block of code and required the application to be taken down for the duration of the maintenance. This usually took a weekend and needed to be scheduled carefully, as the services would be down for the customers during that time. Not much difference from the application or user-experience side compared to before, except you no longer had to own your own datacenter to run the applications and offer the services to your customers.


The downside of this setup was that it wasn't agile: the market and customer needs started changing faster than the application developers could react. Building a huge monolithic application with hundreds of thousands of lines of code, sometimes millions, was slow, and changing the code and building a new version wasn't much faster. It was easier when applications were smaller, but the technology offered so many possibilities and new services that the amount of code needed for an application had grown exponentially year by year. Gone were the days when you could build your code by copying it from an IT magazine and typing those few pages into your computer (still the case in the 80s, as those who have been in the industry a bit longer remember). This route had come to its end and something needed to be done. Even though computers got faster every year, that didn't help, as the applications had become too inflexible and cumbersome - just too big.


The change came with another era of virtualization, where the application was detached from the operating system's services. Until then the application was still stitched to the operating system: you built the application for the operating system you ran on the server. The operating system was virtualized from the hardware, but the application was still bound to the operating system, needing its libraries and calling directly the services the operating system offered. What was needed was abstracting the operating system layer by creating another layer that separated the application from the operating system services. Just as earlier we had one piece of hardware that could run many different virtual operating systems on top of a virtualization layer, which sat in the middle and passed back and forth what the operating system needed from the server hardware, now we did the same for the applications: building a virtualization layer between operating system and application, so that we could build smaller applications that didn't have to include all the operating-system tools and libraries just to run.
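
To make that concrete with a present-day example (mine, not part of the original text): running an application as a container means shipping it with just the libraries it needs, on top of a shared operating system kernel. A minimal sketch using the Docker SDK for Python, assuming a local Docker daemon and the docker package are installed; the image name and command are only illustrative:

```python
# Minimal sketch: run a small, self-contained application image in a container.
# Assumes a local Docker daemon and the Docker SDK for Python (pip install docker).
# The image name and command are illustrative only.
import docker

client = docker.from_env()                       # connect to the local Docker daemon
output = client.containers.run(
    "alpine:latest",                             # tiny image instead of a full OS install
    command=["echo", "hello from a container"],  # the 'application' shipped with the image
    remove=True,                                 # clean up the container when it exits
)
print(output.decode())
```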


This made it possible to build smaller applications for smaller use cases. By using application programming interfaces (APIs), these smaller applications could specialize in narrower tasks, being more efficient, smaller and easier to build and maintain. And by combining these smaller services as needed, you could create an application that is far more flexible. Bringing changes or new features to the application became easier, as we no longer needed to touch the whole mammoth monolith; we could just pick one small part of the application and replace it entirely with a new one.
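
As an illustration (my sketch, not from the original post) of what such a small, single-purpose service can look like, here is a tiny HTTP API built with only the Python standard library; the endpoint path, port and returned data are made up for the example:

```python
# Minimal sketch of a tiny single-purpose service exposed over an HTTP API.
# Everything here (endpoint path, port, data) is illustrative only.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class PriceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # One small task: return a price for the requested item.
        if self.path.startswith("/api/price"):
            body = json.dumps({"item": "example", "price": 9.90}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

if __name__ == "__main__":
    # Other services (or a front end) call this API; replacing this service
    # later does not require touching the rest of the application.
    HTTPServer(("0.0.0.0", 8080), PriceHandler).serve_forever()
```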

Picture: Dimi Doukas using Dezgo.com AI picture generator


This is pretty much where we are today. We still have so much legacy that what happens is we create new applications, or new fronts (user interfaces) to existing applications, in modern ways, but still use some legacy sources as part of them. This could be a database holding sensitive information that we want (or are bound by regulation) to keep within our on-prem datacenter. There are still reasons for not taking everything to the cloud. The idea of moving everything to the cloud doesn't work for everyone, and over the past couple of years there has actually been a trend of bringing some services back on-prem, for various reasons - cost and complexity among them. And this complexity is growing. We're seeing more hybrid and multi-cloud environments, where it is hard to maintain the organization's policies and security when the platforms and tools are different. This complexity is a security hazard and makes it really difficult to have visibility and see what the current security posture is.


What this means in the end is that complexity will keep growing. There's an evolution going on where APIs are becoming a bigger attack surface, due to how fast their use is growing - and due to the lack of visibility, management and security around them. With the complexity there's much more to do, but the teams are not growing at the same pace. So the solution will be using more automation. To answer the requirements the business sets for IT - be fast, agile and secure - there's no other way but to simplify trivial tasks, and later more complex ones, by automating them. This can be a matter of 'life and death' for companies wanting to stay in the game. For many companies and organizations the role of IT will grow, and modernizing the way we do things is already seen as a strategic initiative. And since teams are still very limited in size and the demands keep growing, we have two things we can do to keep up: automation, and partial outsourcing or buying more resources and services from outside - while still keeping the core skills and knowledge in-house. These new directions will bring new business opportunities for IT companies and will continue to change the way we operate our applications.
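
As a small illustration of the kind of trivial task that can be automated first (my sketch, with placeholder URLs, not a recommendation of any specific tool): a script that checks whether a list of API endpoints is reachable and responding, something a scheduler can run instead of a person:

```python
# Minimal sketch: automate one trivial, repetitive check instead of doing it by hand.
# The endpoint list is a placeholder; a real inventory would come from your own systems.
import urllib.request
from urllib.error import URLError

ENDPOINTS = [
    "https://api.example.com/health",
    "https://internal.example.org/api/status",
]

def check(url: str) -> str:
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            return f"{url}: HTTP {resp.status}"
    except URLError as exc:
        return f"{url}: UNREACHABLE ({exc.reason})"

if __name__ == "__main__":
    # Run from a scheduler (cron, a CI pipeline, etc.) and alert on failures.
    for line in map(check, ENDPOINTS):
        print(line)
```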




Saturday, January 13, 2018

Is the set-top box phenomenon happening now with IT products too?


Today there are many networking and security features we need in a hybrid cloud environment. We need 'firewalling' on both the network and the application level. We also need all kinds of intrusion prevention techniques. Authentication is key, as is federating that authentication to different services. For most of these things we nowadays have a separate device. Some of the features can be done within the same device, but not all. And if you're a company using both a local datacenter and different cloud-based applications like O365, Salesforce etc., then you'll be using many different cloud environments. In that case you'll need the same kinds of solutions in both the physical and the virtualized environments. Keep in mind that both the local datacenter and the cloud service (Azure, AWS, Google etc.) are based on hardware with a virtualization layer on top. In the end they're all just datacenters, whether yours or someone else's. So we need to protect both layers, physical and virtual. And don't count on the service provider, into whose cloud you put your virtual servers and applications, to be protecting your data and applications. That is always your responsibility in the end - you're the one who loses most if and when something happens.

For most of these security needs we have either a hardware or a software product to do the job. The problem is that there's no silver bullet, no single solution that could tackle everything, so we need multiple point solutions. This reminds me of the set-top box era in the television industry, where awkward separate products were needed, sitting on top of the TV set. As time passed those products were integrated into the TV sets themselves. We don't need the separate boxes anymore, and the ones we had are now junk. The same will mostly happen to the products we're using today as separate solutions: they will be absorbed and vanish as standalone products. New products will come instead, but for a totally different purpose. The ones we have today will be mostly gone in a few years' time.

So this puts pressure on IT manufacturers to stay ahead and develop new things, as the revenue stream coming from today's products will be decreasing rapidly. It is also a chance for newcomers to enter the market - the ones who are agile enough and who don't let legacy products dictate where to go or what to do. We're seeing a similar problem nowadays in the professional DSLR camera market, where the giants Canon and Nikon are struggling with DSLR cameras that still use a mirror inside the camera body to produce the picture. They have a huge manufacturing base built around the stuff they've been selling for years, and moving to something totally new and different seems to be a very difficult and slow task. We've seen for years already that cameras are going mirrorless, and they're still not there with their professional cameras. This has given Sony, with its mirrorless cameras, a chance to steal market share from those two. History has so many examples of what happens when you stop developing something totally new, don't stay ahead, and try to fight against the progress that happens with or without you.

So we who work in the IT industry need to start thinking years ahead - 5 to 10 years from now. What is it probably going to look like then, and what does that mean for us? We need to start adjusting right away, not only our products but also our services and knowledge, to meet those future demands. Only then can we still be on the market and even grow our market share.

It's time to place your bets, ladies and gentlemen!

Thursday, September 08, 2016

Moving to the cloud: how to handle user information in a hybrid cloud?

Today most of the clouds are hybrid clouds

Companies are moving to the cloud - whether they notice it or not. Starting to use Microsoft O365, or a CRM as SaaS (Software as a Service), means using a cloud service and moving parts of your software and services into the cloud. The result is, in practice, a hybrid cloud. Especially when this happens in small steps over a longer period, it can lead to forgetting something very fundamental: user and data protection, and the legislation around them.

O365 is a great example. More and more companies have moved to it. Now, what about authentication? Where do you keep your user credentials, and what happens when the main site of the service goes down? Where are the credentials in that case - and can you be sure? And what if you have other services in the cloud as well? How will you offer these services easily to the user? Will you handle authentication for each service separately? And when one day you need to remove a user from everything he or she had access to, how will you do that?

The only certain way to make sure user authentication and user data aren't leaking out is to keep them to yourself, meaning on your own premises. Buy the services from wherever you want, but don't push the user data into the cloud service - it's not necessary. There are several ways today to authenticate against services in the cloud without user credentials ever leaving your own premises, for example using OAuth (more information about OAuth here) and SAML. Another thing that can make accessing the whole hybrid cloud easier is offering the user a webtop: one place to access whatever the user is entitled to, based on how strong the authentication is, what network the user is coming from and what client the user is using to access the service. Role- and device-based access management.
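
To illustrate the idea (my sketch, not from the original post, with placeholder endpoints and credentials): in an OAuth 2.0 style setup the application fetches a short-lived token from your own on-premises identity provider and presents only that token to the cloud service, so the actual credentials never leave your premises. A minimal client-credentials example in Python, standard library only:

```python
# Minimal sketch of an OAuth 2.0 client-credentials flow against an on-prem
# identity provider. All endpoints and credentials below are placeholders.
import json
import urllib.parse
import urllib.request

TOKEN_URL = "https://idp.internal.example.com/oauth2/token"   # your own IdP
CLIENT_ID = "my-app"                                          # placeholder
CLIENT_SECRET = "change-me"                                   # placeholder, kept on-prem

def get_access_token() -> str:
    data = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
    }).encode()
    with urllib.request.urlopen(TOKEN_URL, data=data, timeout=10) as resp:
        return json.load(resp)["access_token"]

# The cloud service only ever sees the short-lived bearer token:
# req = urllib.request.Request(
#     "https://api.cloudservice.example/v1/resource",
#     headers={"Authorization": f"Bearer {get_access_token()}"},
# )
```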

You can't forget the legislation either. In the EU there will be the GDPR (more information about GDPR here), which will be very strict about how and where you keep user-related data. In case it leaves the EU: 'In any event, transfers to third countries and international organisations may only be carried out in full compliance with this Regulation' - including the case where the service provider's main EU site goes down and the service is moved to a backup site (often outside the EU). In practice that can be nearly impossible to guarantee, so you should be looking for a solution where you can be sure the user data stays somewhere it cannot leak out from and can be easily managed from one place, even if you're using several different clouds. In the end you also need to be able to track whatever the user has been using in order to find the related user data - in case the user wants to be forgotten and have everything related to him or her removed - and to remove the user's access from all services at once.

As applications move to the web, it becomes more and more about application services - not so much about the network. The network of course plays a major role in accessibility and user experience, but that is bread and butter which simply needs to work; only then can you move into the cloud.

If you're starting to use cloud services in a hybrid way, or already are, then one of the major things to think about is an easy and secure way of taking care of authentication and federation. Before you've solved that, you're not ready to move into the hybrid cloud. Luckily there are options - not one silver bullet that solves everything, but good options that make your life easier, keep the user credentials secure, make the user's life easier (SSO and a webtop) and help you follow the legislation. Make sure you go through this with your IT department and service providers before you take the final step.




Tuesday, September 09, 2014

K.M.S.S. > Keep Me Simple and Stupid

K.M.S.S. Keep Me Simple and Stupid
photo: Freedigitalphotos
We should keep our solutions and offering simple. But we, the ones working in the IT industry, can't always have that luxury. The day you stop learning new things and following where we're heading, you're out. On the other hand, we in the IT industry easily get blinded by the products, solutions and services, and especially by abstract things like 'cloud' and 'virtual'. We even have virtual people working in our companies now: virtual roles, virtual teams. Wonder who's going to get their paychecks? Which reminds me of the old saying, 'the employee pretends to work while the employer pretends to pay the salary' :) Wonder how much of that is true with the virtual roles...

Back to the subject :) K.M.S.S. (Keep Me Simple and Stupid), a twist on K.I.S.S. (Keep It Simple, Stupid), has an important message behind it. If we in the industry think in overly complicated ways, then our solutions and offerings will be overly complicated and will not meet customer needs. Customers don't need complicated, endlessly flexible solutions capable of doing everything IF needed. When you take a step back from IT (as a tool) to the business (as the master - meaning the customer), you come to the fact that what the business needs is not that complicated. You just have to look at it from the business perspective: what does the business need to get certain business objectives accomplished? And we should avoid the Lego-block way of thinking at the product level - 'if I have product X, then product Y would fit wonderfully and bring me all these new features which let me do whatever might be needed.' The fact is that the business that bought IT as a tool in the first place does not need to be ready to do 'everything'. That's the important thing to realize. Even if they need to be agile, it doesn't mean they need an IT environment that grows independently, without any real connection to the business. And that's actually not very uncommon.

Some years ago (actually closer to 10) there was a clear movement to get IT managers and IT directors into the boardroom, to create that connection between business and IT. But my gut feeling is that we've backed away from that, and more and more IT has been left on its own in reactive mode, because from the CxO point of view it probably gets so complicated and so technical that it's easier to bypass it in a way. Leave it alone :)

This needs to be corrected. We as members of the IT industry need to make things simpler - first for ourselves, and through that we're able to simplify them for the customers too. We need more business thinking and less technical thinking to meet user expectations better in the future. Things may in a way get more complicated when everything goes virtualized into the cloud, but we need to be able to translate the stuff under the hood into what really matters. Being a taxi passenger shouldn't mean you first have to understand how the taxi's engine works and what kind of replacement parts or optional extras you can buy for it. The passenger had another kind of task in mind when hiring the taxi: getting from A to B :) That's easy to forget many times.

In the end it always comes down to us, the people, not the technology. Gene Marks writes about this in a Forbes article from the CRM application perspective: the application is there to help, but if it's not understood, implemented, sold and used right, it will not serve anyone as it was meant to.

Wednesday, September 03, 2014

The battle of giants in the Cloud service platform arena - over $14 Billion to take in 2017

Which one will hit the bullseye with their Cloud service platforms?
Microsoft, Google or Amazon?
Q: 'How is the weather?'
A: 'Well, I think it's going to be CLOUDY'

But it might be that you'll see a couple of bigger clouds in the sky rather than scattered smaller ones - at least when you look at what the giants are doing at the moment. According to an IDC press release about Platform-as-a-Service (PaaS), the market is going to grow to over $14 billion in 2017. The total market is going to be segmented into six competitive sub-markets: application platform-as-a-service, database platform-as-a-service, integration platform-as-a-service, business process management platform-as-a-service, cloud testing, and other platform-as-a-service. And guess who's at the top of the list to take the biggest shares of it? Microsoft Azure, Google Compute Engine and Amazon Web Services. They are there already. Next to them there are many players wanting their share, like Oracle and IBM, and many other vendors knocking at the door, building their cloud-based platforms.

All this will change everything - not only the way we consume IT, but also the way we need to build the infrastructure. It's going to change not only the IT business, bringing a whole new era of services, applications and devices we didn't dream of, but it will also hit traditional business. It will eventually close down a lot of companies that don't keep up with the times and develop their offering. Those totally new ways of doing things, enabled by new technology, will also drive some other business areas out of business. Not only happy things will come out of this development. People will need to adapt more, and faster, than ever before to keep up.

Customers don't necessarily even know they're not ready for the new era and the new uses of IT and data. The networks, the infrastructure, are not there yet - not even close. Usability, accessibility, availability and cloud security will play a major role in the future. Which companies today are capable of building all the things we'll need tomorrow?

The big leading IT companies, the visionaries, will create the new market and the new demand. Many of today's companies are not ready for it, and many vendors we know will either be acquired or simply vanish from the market, losing their business - the same thing that happened to Kodak in the camera and film industry. It's really time to wake up. This is the last call to get on the train.

Monday, July 28, 2014

Remote worker - yeah sure, who are you kidding?

Working remotely on a terrace - beats the open office, eh?
My mom called me today and asked, 'are you at work or at home?' When I answered 'both', there was a pause, a longer silence, during which I guessed she was trying to figure out what I meant. From her I kind of understand it. She's from an era (she's 70) that wasn't so used to the idea of someone working at home - although there have always been professions that allowed working from a 'home office', like tailors or shoemakers. Before the era of shops and malls, they usually worked at home, where the customers would come to them.

It's not only the older generation who have difficulty understanding someone working at home. It's as if you either work or you're at home - like the two couldn't be the same. I've found this thinking surprisingly common in the IT business too. Sure, we sell 'remote worker solutions', but when it comes to working remotely yourself, for some reason lots of people still have some kind of fundamental doubt about it. You get the same question from IT professionals that my mom asked me: 'are you working, or at home?' When you tell someone face to face that you're going home to work remotely, or that you'll be at home working tomorrow, there's this look in their eye, maybe even a grin - as if you'd just used a widely known euphemism for 'I'm actually not working but doing something else', like sunbathing, gardening or just watching TV. Do remote work and you'll do less of those things, I'd say :) People who drop their gloves at four o'clock (or three, if they came in late) will actually get to do a lot more other stuff than work, compared to the ones carrying their work home (or keeping it there). Earlier I had my working corner in our bedroom. As the desk lamp lit up the room too much at night, irritating my wife, I bought an illuminated keyboard. I mean... 'how low can you go...?' :)

Even though I've been a remote worker for as long as it has been possible, I still think remote work was invented by the employers (if not by the devil himself) and not by the employees :) I find myself doing longer days, as there are no breaks like driving to work or driving home. So you end up sitting there until night comes, unless you have a hobby or some event that drags you out of your seat. And nowadays we don't only have the option of working remotely from home: with smartphones, and even more with tablets and 3G/4G-connected laptops, we can work from anywhere. And we really do work from anywhere - from our vacation, from the highway going 80 miles per hour, from the movie theatre. So what happens, with technology evolving faster than our thinking, is that we're working more and more 24/7. In the end that's not going to serve anyone. Employees will be more tired, doing less with more time because of it, and the quality of their work will suffer. As in so many other areas, 'less is more' is true here as well.

Sometimes power outages are welcome; they remind you that you could really do something else for a while. Having a DSL connection isn't always a blessing. When you're passionate about IT, about technology overall, you seem to want to be connected everywhere. Sometimes it just goes overboard. The pace isn't days, it isn't hours, not even minutes - it's more like seconds that we expect for getting answers, picking up the phone, answering email and chat. I see it in my own work: earlier you asked for pricing and expected it in a couple of days (if the provider of the quote was fast); now we need it (or think we do) in minutes, a couple of hours max. We can't go much faster than we already do unless we leave people out of the process altogether. Having seen so many times that we hurry up only to end up waiting, I find myself asking more and more often, 'what's the rush?' Why do I need to do the pricing in a couple of hours if the order will still come maybe weeks or even months later? We've just got used to the idea that when someone asks for something, we need to answer right away - the way the internet generation has learned to behave. If the web page doesn't open in 5 seconds, you're already typing another address. Not everything is priority 1, and I think each of us should remind ourselves of that. Only very rare cases actually are priority 1. Otherwise it's the old story of the boy who cried wolf: if you shout too often about something that isn't actually true, no one will take it seriously when you really are in a hurry. They just get tired of you pushing to get everything this minute when you don't really need it that fast in the end.

So balancing your working time and your off-duty time, and drawing a more solid line between the two, is important. We're not paid to be there 24/7, 365 days a year. It's good to be flexible and do things when needed, but if it becomes the situation every month, every week or every day, then something is fundamentally wrong and needs to be fixed some other way than burning the candle at both ends all the time. I've heard that in Germany it's even common that if you have a company phone, you leave it at the office when you leave the office. I know that in many other countries - like Finland, in my case - you use the same phone after work for your private life as well. So you have the same phone and number all the time, reachable for both work and personal matters. Leaving the work phone at the office could be a tempting option :)

Will the pace slow down? It really needs to. Technology is not going to be the solution for that. Once again we need to realize that we're not evolving as fast as technology - and we're not an extension of the technology either. Like fire, it can be a good servant but a lousy master. The same goes for technology. We just tend to forget that.


Monday, July 14, 2014

IT networks are the bottlenecks that will slow us down

Most of the networks are outdated and are preventing the adoption of new technology and applications
We're now in the Ethernet era, often using 1Gb connections, 10Gb connections or Fibre Channel at 32Gb, or even faster links (such as 128Gb Fibre Channel, forecast to be published in 2016). So where's the problem? That's been taken care of, right? Wrong. That served us fine over the past decade. What we've seen over recent years has been the explosion of mobile devices - mostly phones, and now tablets. Even with that, we still managed somehow, because it has only been the first step in the evolution to come. What we'll see next is that the number of devices needing a network connection will be something we couldn't foresee. And there's the real problem. In the coming 5 years we'll see a lot of totally new integrated intelligence installed into products we didn't think would be possible. The near future will already look like science fiction, as we'll be able to get information from all kinds of things. AI (artificial intelligence) will be sewn into our clothing, into diapers for babies and for people needing them in hospital and care-home environments. New kinds of sensors and applications will be built into watches, helmets, driving gear, cars, cups, shoes, tools, toys, bandages, homes, offices - into wallpaper, into road tarmac. Everywhere. The number of uses is limitless, and there lies our problem. Each of those integrated devices will need a network connection, mostly a wireless one. Even if you use short-range connections like Bluetooth to gather the data onto some other device, that won't solve the problem. There will be a HUGE increase in the amount of data passing through the networks, and in the speed at which that data needs to be transferred.

The current state of the network is mostly outdated. Even if you've refreshed your network devices this year, it won't be enough, as your network design, as well as the devices you've bought, are most likely already outdated. And the same is true all the way from smaller companies to the telcos that provide the backbone networks. The closer to the backbone we go, the more crucial the problem becomes. The investments needed are so big that companies are not willing to make them in advance. So they will slow down development for years and prevent people from adopting the new technology as fast as would otherwise be possible.

This new kind of technology will generate whole new business opportunities that were impossible earlier. Imagination will be the only limit, as there are millions and millions of things you can monitor, get data from, and then produce something useful or entertaining with that data. And once we've taken the first step, the next will be combining these solutions and their data to create more complex, more complete solutions, where the units talk to each other and produce even more useful information that we can use to make our lives better and safer.

So we're already there, in the science fiction movie. It's here today. But our company networks and core networks are not there - not to mention the networks in our homes and the houses we live in. I would say it's time to stop saving in the wrong place. In this era of virtualization and clouds, everyone is talking about how you should move your servers and applications into the cloud. But even if you do that, you still need to access them. The data, the pictures, the video need to pass through your local network one way or another. So you can't escape the fact that your own yard needs to be fixed first. Start with your home or company network, whether the servers and applications are in your local network or in the cloud. Make sure you have enough capacity to begin with. What's more important is that your plan and the solution you get are scalable, and that you don't build yourself bottlenecks where everything is efficient until you come to that one spot where it isn't - like the internet connection. Once again, your network is only as good as its worst (slowest) component. And it's as vulnerable as your high-availability options and secondary connections - or the lack of them. So start demanding this, not only from your own network, but also from the service provider that provides the connection and the backbone you'll be using now and in the future. Make sure they don't jeopardize your future with inadequate network solutions.

Over the past 20 years, the IT generation has built what we've got now, and that has been a huge leap. But the next 20 years will deliver - and require - easily 1000 times more when it comes to speed, the amount of data, the wider range of application use and the complexity of the networks we'll be seeing. We thought we'd seen it all, but we've seen nothing yet. This is just the beginning; what we've seen so far has barely scratched the surface. From now on we'll need much longer lifecycles for devices from the manufacturers. We need to get out of this 3-year hardware cycle. We need more scalable, longer-lifecycle products that we can use for 5-10 years - products you don't need to change often, that are modular and can be expanded, something we could leave in place for 10 years without needing to touch it. As the number of products increases, we need to start building them better so they last longer. Cheap, plastic, throwaway products are not the sustainable development we can build on, for both energy and environmental reasons.

So let's start demanding it from the network providers and the manufacturers. Let's make a choice with our feet and with our money. And let's make our networks ready for the future to come.