Posted on November 11th, 2019 - By Matt Parkinson
It’s the final day of Microsoft Ignite 2019 and the conference centre is significantly quieter, with many people already on their travels home. The final day runs only until 2PM, yet it’s still packed with 235 sessions, none of which are any less important than those of the previous days; in fact, from our own experience it contained one of the most useful sessions of the whole week.
The day started with a session from Sami Laiho on 45 life hacks for the Windows OS in 45 minutes. This was perhaps one of the most entertaining sessions of the week; although the promised 45 life hacks ended up at around 14, several useful tips and tricks were demonstrated. The most notable were for debugging in Windows and mapping out memory allocations for processes, for which IT admins should look into the Sysinternals VMMap tool.
Next up for the day was a session on automating and managing Windows server environments, perhaps one of the most useful sessions of the week and one saved until last. The session started by recognising that traditional on-premises server environments and cloud deployments are both growing and will continue to co-exist for the foreseeable future, and that Microsoft are working on providing more ways to manage traditional workloads by attaching them to Azure tooling.
If you followed our Oracle coverage a couple of months ago, this is something you may recall being echoed by Oracle: the line between on-premises and cloud is blurring to enable you to run your workloads wherever suits you best.
The session went on to cover how cloud is no longer just about infrastructure but about services and management capabilities, many of which were about to be covered. There was also talk of how IT management is shifting from being entirely controlled by IT admins to becoming more an offering of guidance and governance, allowing different departments to take care of their own applications to a certain extent.
To this end Microsoft are working on creating cloud-based equivalents of on-premises tools, enhancing them in the process to take advantage of cloud capabilities, and introducing what they are calling co-managed solutions. These tools and their cloud counterparts are covered on the slide below.
The most noteworthy of these tools is the Azure Update Management service, which allows unified patch management of Windows and Linux systems both on-premises and in any cloud environment, whether that’s Microsoft Azure or another provider. The OS- and cloud-agnostic emphasis of this tool is great for providing a single-pane-of-glass view of an entire organisation’s infrastructure and the compliance status of the devices within it. Best of all, the tool is free provided you keep your log management below 5GB, a limit that would require quite a large infrastructure under management to exceed.
The other tool worth mentioning is change tracking, which allows you to continually monitor devices within your infrastructure for changes to software, services, files and registry entries, and again is available for both Azure-based services and other environments, a theme common amongst all of the tools shown. What’s great about change tracking is that it runs in real-time and reports its changes to the Azure portal, so that in the event of an IT outage it’s possible to see what changed leading up to it, taking away some of the unknowns in certain outages.
The last of the tools to mention is the Azure Best Practices Analyser which, although currently in a closed beta, will soon be available. It will analyse the services connected to Azure and how they are configured, giving you actionable steps on how to modify your services to conform to best practices.
Those who have been around the Microsoft world for some time will know that best practices have always been difficult to achieve in real-world environments, especially when business demands dictate otherwise; this tool helps provide a path to getting as close to best practice as possible. The actionable insights help to show progress towards best practices, along with information on why you should change things. All the information is based on a long-standing best-practices white paper that took a number of months of assessment to write, now made easy to consume in this new tool.
That’s it for Microsoft Ignite 2019 as the event comes to a close and we ourselves set out on our travels back to the UK. It’s been clear that Microsoft are heavily investing in hybrid cloud, as we’ve also seen from Oracle and Amazon; however, Microsoft appear to have a bit of an upper hand in also being cloud agnostic and allowing their tools to manage other cloud environments as well. Only a few years ago most people would never have imagined a world where both Red Hat and Oracle had booths at a Microsoft conference and vice versa.
Microsoft have sent a clear message that they want to be the software vendor for all operating systems and all environments, opening doors to competitors which were once firmly closed. Unlike other cloud conferences they also don’t poke fun at their competitors but embrace them instead, which we believe puts Microsoft in the front seat for cloud transformation.
Microsoft put forward a clear emphasis on cyber security and process automation as well this year, which have been shown to be two of the top concerns for businesses worldwide, and both areas are where most of our action points for the next few weeks lie. We’re excited to be bringing more secure and more cost-effective solutions to our clients, protecting their data in more enhanced ways and improving performance via automation with tools such as Power Automate.
It was a great week with lots learnt by the team here, which we’re excited to bring back to the UK and speak to everyone about.
Posted on November 8th, 2019 - By Matt Parkinson
The last full day of Microsoft Ignite 2019 is upon us, but the content hasn’t let up and shows no sign of doing so, with another half day tomorrow and lots of key topics still to be discussed. Today was another day packed primarily with more content on security and vulnerability management, along with some development sessions, which we find crucial for being able to create solutions where one does not already exist.
The day started with a session on new capabilities in threat and vulnerability management, which has been a common topic through the week as cyber security is seen as one of the biggest threats to businesses looking to adopt or grow their cloud strategy. The session highlighted this further by summarising that spend on cyber security, and particularly threat and vulnerability management, is growing and is a sizeable expense in many organisations, yet most are still vulnerable.
It was also noted that vulnerability management can often be driven by whatever is given the most hype in the media, or by the vulnerabilities that carry the highest severity, without much assessment of the vulnerability within the context of the organisation. This context is important in understanding whether there are other mitigating factors, or whether the systems that are vulnerable are of low value to the organisation. Assessing vulnerabilities within the context of the organisation can produce very different priorities, and Microsoft are aiming to address this with their TVM scoring approach, the factors of which are shown below, producing a score out of 100 for the priority of addressing the threat within that context.
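To make the idea of a contextual score out of 100 concrete, here is a minimal sketch of a weighted scoring function. The real TVM factors and weights were shown on a slide and are not reproduced here, so every factor name and number below is a hypothetical stand-in, not Microsoft's actual model.

```python
# Illustrative sketch only: factor names and weights below are hypothetical,
# chosen to show the shape of a contextual, weighted 0-100 priority score.

def priority_score(factors, weights):
    """Combine 0-1 factor ratings into a single 0-100 priority score."""
    total_weight = sum(weights.values())
    weighted = sum(factors[name] * weight for name, weight in weights.items())
    return round(100 * weighted / total_weight, 1)

# Hypothetical factors, each rated 0 (low) to 1 (high) for one vulnerability.
factors = {
    "severity": 0.9,           # raw CVSS-style severity
    "exploit_available": 1.0,  # is a public exploit known?
    "asset_value": 0.3,        # business value of the affected systems
    "exposure": 0.2,           # reachable from the internet?
}
weights = {"severity": 3, "exploit_available": 2, "asset_value": 3, "exposure": 2}

print(priority_score(factors, weights))  # → 60.0
```

Note how a vulnerability with maximum severity still lands mid-table once low asset value and low exposure are weighed in, which is exactly the re-prioritisation effect described above.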
Next up for the day was a great session on changes in Windows 10 and Office 365 update deployment, including how and why to stay current. The session started by making the case for why you should stay current, putting security forward as the biggest reason, but also citing that a lot of work goes into increasing stability and performance, so by installing the latest updates you can improve your end-user experience.
It’s quite well known that end-users and administrators dread Windows updates because of the history of compatibility issues, long downtime first thing in the morning and problems even installing the updates. Microsoft recognise that the history of updates has not been good and have been putting a lot of time into solving these issues.
Microsoft reported recent changes to the way updates are delivered, having reduced the mean install time from 82 minutes to 28 minutes, partially by decreasing payload sizes but also by optimising to do what’s possible while the system is online and reducing the number of reboots. They have also made updates install at shutdown rather than at boot, so that when you turn your PC off it installs the updates then, rather than imposing the 20-30 minute wait first thing in the morning that many people have grown to dread.
Microsoft also introduced their App Assure program to address compatibility issues with the upgrade from Windows 7 to Windows 10 and with updates within Windows 10. Under App Assure, Microsoft guarantee that an application that previously worked will continue to work; if it doesn’t, you can call them and they will fix it for you, or fix Windows. This program has resulted in a 99.8% compatibility figure, which is quite impressive to those who have been through numerous updates in the past and suffered compatibility issues.
Lastly on updates, Microsoft have also put work into extra tools to report on why an update installation has failed, and introduced cloud-based recovery, which is able to repair the installation using files from the Microsoft cloud. Given how negative the experience has been in the past, it may take some time before users start to trust updates again, but it certainly seems like significant steps forward have been taken.
The last session of note for the day was on Office 365 Advanced Threat Protection, or ATP for short. Office 365 ATP builds on the previous iterations of ATP for Exchange Online but extends it with wider support for the Office 365 suite, and offers a range of new features including best-practices analysis and automated recommendations on how to configure ATP for the best results. What we found most impressive in this session were the statistics on the sheer volume of data Microsoft processes to prevent attacks, and just how many attacks they are preventing, especially in the zero-day area. These stats go to show why Microsoft’s threat and vulnerability programs are succeeding, given the amount of data they are able to build a picture from.
That’s it for day 4 of Microsoft Ignite but we’ll be back for the final morning of coverage from Day 5 tomorrow including a summary and round-up of the entire week so please be sure to check back tomorrow.
Posted on November 7th, 2019 - By Matt Parkinson
Day 3 of Microsoft Ignite 2019 has been and gone and we’re now over halfway through the annual I.T. pro conference. Lots of information has been taken in, but there is still plenty more to come, plus the highly anticipated attendee celebration party tomorrow, but first a round-up of today’s events!
Day 3 was packed full of information on security and compliance as well as the Power suite, with things starting off with a session covering the changes in Microsoft Secure Score. This session was quick to make an impact with important statistics, such as that 93% of breaches could be stopped with just basic measures, but also recognised that cyber hygiene is hard to maintain due to an ever-changing landscape.
It’s also important to note that 99% of all attempted breaches are stopped; however, the remaining 1% is still a significant number of breaches when considered on a global scale. We also have to remember that as professionals responsible for securing systems we have to aim to be successful 100% of the time, whereas an attacker only has to succeed once, so the odds are stacked against us.
It’s these heavily stacked odds that Microsoft Secure Score aims to help with, a service which was first introduced around 2 years ago and has just gone into a new preview phase with several advancements and a new interface. These advancements aim to make Secure Score simpler and more tailored to each tenant by using artificial intelligence and machine learning to highlight the most important security changes to make for that tenant.
Microsoft see Secure Score as such an important tool in managing the threat landscape that they are also working on making it extensible, so that third-party providers can bring recommendations for their own applications into Secure Score, providing a single pane of glass for an entire organisation’s security health.
Further on in the day we heard more about security in relation to Microsoft’s unified endpoint security management tools, which aim to bridge the differences between SecOps and I.T. admins to allow them to work more closely to achieve a more secure environment. In this session they also highlighted that Microsoft Defender ATP can replace traditional AV providers and is now one of the leading endpoint protection options in Gartner’s Magic Quadrant. Combined with their ability to adapt the Windows operating system, and Defender now being a built-in option, it’s starting to look like it should become the go-to option for endpoint defence.
Microsoft have also developed a new feature for Windows called tamper protection, which allows global administrators to set policies with a protected payload that prevents local administrators from disabling Microsoft Defender, a common source of unprotected devices and potential cyber breach targets.
Last for the day were several sessions on various parts of the Power suite, primarily Power BI and Microsoft Flow, which has now been re-branded as Power Automate. These tools are essential in helping organisations and individuals to achieve more, which is part of Microsoft’s mission statement.
Power BI and Power Automate have been around for some time; however, Microsoft are making efforts to make the tools more useful than ever and more accessible to a wider user base by making many functions no-code and WYSIWYG-editor based. The aim is to allow anybody who wants to automate tasks to be able to do so, or to extract value from a data set and tell a story with it. The Power BI sessions were quick to demonstrate how much you can do with just a few clicks, and once the content has been uploaded you can view this brief 20-minute session as an introduction to Power BI: https://myignite.techcommunity.microsoft.com/sessions/83508
That’s it for today’s round-up and we hope you’ve enjoyed reading everything from the week so far but we’ll be back tomorrow to cover the last full day of sessions.
Posted on November 6th, 2019 - By Matt Parkinson
It’s the end of day 2 of Microsoft Ignite 2019 in Orlando and we’ve been back at the conference centre for an action-packed day of sessions on all things Microsoft.
The day kicked off with a session on Windows Terminal, a new UI combining PowerShell and Command Prompt alongside access to other shells. Windows Terminal is designed to bring a customisable UI, which has always been lacking from PowerShell and Command Prompt, but also brings tabs and pre-defined profiles. Profiles were a big talking point as they are JSON-defined and can be distributed amongst multiple devices or users to allow a team to work from the same configuration set.
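As a rough illustration of what a shareable JSON-defined profile can look like, here is a minimal sketch; the field names follow Windows Terminal's settings format at the time but the specific values and paths are invented for illustration, so check the shipped schema before distributing anything like this:

```json
{
  "profiles": [
    {
      "name": "Team PowerShell",
      "commandline": "powershell.exe",
      "colorScheme": "Campbell",
      "startingDirectory": "C:\\Projects"
    },
    {
      "name": "Command Prompt",
      "commandline": "cmd.exe"
    }
  ]
}
```

Because this is just a text file, a team can keep it in source control and copy it to each machine to get an identical set of tabs and shells everywhere.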
Next up was a session on cloud backup and disaster recovery, which addressed common mistakes in the approach people are taking now that workloads are in cloud and hybrid environments. One of the biggest misconceptions, mentioned at numerous software conferences recently, is that the cloud provider takes on responsibility for your data; in reality, even though the data is in a cloud environment it remains your responsibility. This means if there is a loss of data at the cloud provider, or a data breach via the cloud provider, it is you and not the cloud provider who is ultimately responsible, and who may be hit with fines from the ICO. The image below represents the responsibilities that the client keeps even in a cloud environment; this is not unique to Microsoft solutions but applicable to most cloud providers.
Another common issue when it comes to backup and DR is the ever-growing complexity of multi-cloud and hybrid environments, with data potentially becoming fragmented across different places rather than the traditional approach where all your data was held in a single file-share silo. This makes it all the more important to have a full backup and DR strategy ensuring that all your data silos are covered and protected.
Furthermore, a full backup and DR strategy will implement disaster recovery testing procedures and frequencies to ensure that when you need to recover your data, it’s possible. A survey by Cohesity reports that 5% of companies don’t have a DR plan at all, 25% have never tested their DR plan, and 34% have experienced outages due to problems with their DR plans. These statistics lead on to the fact that we primarily design for performing backups rather than for recovery, when really it’s recovery that is the important part we should be aiming at. It’s more important that we can recover data quickly and effectively than perform backups quickly; designing for recovery often also has a direct impact on backup effectiveness, but the same can’t always be said the other way around.
Moving on, the final notable session of the day was a discussion of zero trust and how it is the go-to approach for making systems more secure in the modern world of I.T. and cloud. Many people had not heard of zero trust until recently, but the concept has been around in some form for a while, originally based on de-perimeterisation.
Zero trust is a mindset above anything else treating every access attempt as if it were from an untrusted network or user and seeking to verify each access attempt in real-time. This starts with an approach which assumes pervasive risk where everything is considered open to the internet and at risk until proven otherwise.
There are 3 key principles of zero trust: verify each access attempt explicitly, use least-privileged access, and assume breach. The first means that each time something is accessed we should seek to verify that the attempt is valid, using more than one source to verify it. Least-privileged access means that on a successful access attempt those credentials only have access to what they need. Lastly, assume breach is the assumption that at some point there is going to be a breach, or that any attempt could be one, and containment of that access attempt is key to preventing wider security issues.
In summary it’s been another information packed day and lots of key points to be discussed further particularly around data ownership and responsibility. We’ll be back at the Orange County Convention Center tomorrow for day 3 of Microsoft Ignite 2019 so be sure to check back!
Posted on November 5th, 2019 - By Matt Parkinson
Microsoft Ignite 2019 kicked off today at the Orange County Convention Center in Orlando and 2 of our team were there to cover the day’s events. The day was set to start with a keynote from Microsoft CEO Satya Nadella, followed by a series of tech keynotes from differing learning paths.
The keynote from Satya started with some incredible statistics: the number of connected devices is set to reach 50 billion by 2030, and the total size of data 175 zettabytes by 2025.
To house these connected devices and their data Microsoft is committing to building new data centres with 100% renewable energy and zero waste, a strategy that will also be rolled out to existing data centres across their 54 different regions around the world.
Further on in the keynote Microsoft mirrored other large hyper-scale cloud providers in making their commitment towards hybrid and on-premises infrastructure which Satya dubbed the new era of hybrid cloud for which a range of new product announcements followed.
Amongst these new announcements was Azure Arc, a control plane for multi-cloud and multi-edge scenarios allowing the Azure stack to be extended into companies’ own data centres, letting them leverage Azure features in on-premises environments. Azure Arc also supports other hyper-scale cloud environments such as AWS, which shows Microsoft adopting a new approach to competitors, embracing their technologies rather than forcing a choice between vendors.
Azure Synapse and Azure Quantum were further announcements on the future of technology, with accessible quantum computing on Azure which, combined with Synapse, is designed to make complex queries ever faster so that everyone can consume data better and faster.
Moving away from the Azure features, a strong emphasis was made on cyber security, with the cost of breaches last year noted to have totalled over $1 trillion. This is a statistic Microsoft are making huge efforts to combat, with enhancements in products such as Advanced Threat Protection and the Microsoft Secure Score and Compliance Score frameworks, which analyse tenant environments and make priority-based suggestions on how to make things both more secure and compliant. Security and compliance made easy was really the big message Microsoft were trying to get across.
Further on in the day there were tech keynotes in various areas, of which security was our primary focus. Here we learnt about the three pillars presented: identity, compliance and security itself. Within each of these pillars Microsoft are providing tools to achieve greater security across all organisations, including things such as single sign-on and conditional access for applications such as DocuSign, GoToMeeting and even Facebook.
These features and keeping devices secure often came back to enabling modern device management, something Microsoft have made significant advancements on over the past year. They are keen to enable everyone to be more secure, noting that 99% of breaches could have been prevented by features already available such as multi-factor authentication and conditional access. Utilising cloud technologies to make all devices more secure, including traditional on-premises applications, was high on the agenda and demonstrated throughout.
Finally for the day was the opening of the hub, something new for 2019 which incorporated the traditional expo floor with more learning theatres and partner products. This was really a fantastic experience with easy access to Microsoft professionals to get into details on specific issues around implementations and on-going product problems. If there is something that you would like us to discuss on your behalf then be sure to let us know.
That’s it for our round-up of day 1 of Microsoft Ignite; far more happened than we can fit in this blog post, so be sure to contact us to hear more, tailor-made to your own environment, on how to make the most of existing and new products.
Posted on November 4th, 2019 - By Matt Parkinson
This week is the biggest week in our annual conference calendar as we head to Microsoft Ignite in Orlando for a week filled with all things Microsoft. We’ve been attending Microsoft Ignite since its inception in 2014, when it changed from what used to be known as TechEd, and the event is ever changing and growing, which makes each year refreshing.
The reason we attend every year is not just for the latest updates on Microsoft products but also because it’s one of the best conferences for other business insights and personal development sessions. This makes it an incredibly valuable event and probably one of the best organised which is why this year 2 of us will be travelling to Ignite to be able to cover more of the event than ever before.
Microsoft Ignite is based at the Orange County Convention Center, usually with upwards of 30,000 attendees and over 2000 different sessions covering everything from products to how to develop your career in I.T. The conference centre is huge, with some sessions catering for several thousand attendees, and a lot of walking takes place across the week, with a previous record of 26,000 steps in a single day.
In addition to sessions there is also a large expo floor with perhaps some of the best giveaways of any conference, ranging from drones and Surfaces to Xboxes. You also can’t be a conference attendee without coming home with some swag such as t-shirts, stress balls, portable chargers and so on, which makes for fun fitting it all in the travel bag home.
Although there is a fun element to the conference the reason we’re really taking such a large chunk of time out is for the value that we’re able to bring back for our clients. During the normal day to day life of work it’s easy to focus on your existing product set only and these conferences allow us to expand our vision of the future of I.T. services and how we can help our clients to achieve more.
We’ll be working to bring you the latest news and updates on this blog through the week so please check back each day. We’ll also be bringing some video snippets of the conference this year so you can experience it through us in more detail.
Once the conference has finished we’ll be working to co-ordinate action points for clients where relevant on both achieving more with existing products and an insight into what products are coming that could be game changers.
Posted on September 20th, 2019 - By Matt Parkinson
It’s the 4th and final day from the Moscone Center in downtown San Francisco for Oracle OpenWorld. Despite it being the final day, with noticeably fewer people than on previous days, a packed schedule awaited, with a further 6 sessions primarily covering information security utilising Oracle products.
When we talk about information security we categorise it into 3 areas: confidentiality, integrity and availability. That is, we’re looking at the protection of the data, the accuracy and consistency of the data, and last but not least the ability to access that data. This means we’re covering sessions on high availability and disaster recovery in addition to data protection in today’s run-down of events.
The first session of the day kicked off with new features of Enterprise Manager to capture important events and notifications from Oracle Database systems. Enterprise Manager acts as a single pane of glass for viewing multiple database environments, whether they are on-premises, in another cloud, or in Oracle Cloud.
Enterprise Manager comes with some default monitoring templates, but custom templates can also be created and pushed out to all of the monitored hosts to make deployment and management simple and easy to change.
New in Enterprise Manager is the ability to group events into a single notification rather than receiving an alert for each. This is particularly useful during maintenance windows, such as when a node is taken offline and it is expected that several of the related monitors are going to trigger. Another new feature is the detection of runaway SQL queries, with the ability to automatically kill the runaway SQL. This feature works by looking for hung processes, as well as SQL queries consuming more resources than defined, and takes corrective action to rectify them. This is something we monitor separately ourselves, but it will be a great addition to Enterprise Manager as well.
Next up for the day was a session covering the most important security features of Oracle Database and how to keep data secure. This session primarily covered features that have been around for a while but are either not configured to their full potential or are little known, so people are simply not using them. Oracle talk about 3 fundamental steps in securing the database: assess your current state, detect improper access to data, and prevent improper access to data.
The main feature to take away from this session, which we will be working to roll out to our Oracle databases, is the enabling of unified auditing, which combines the current 7 audit locations into a single audit trail. The other feature is network encryption, which allows client connections to be encrypted so that data is better protected whilst in transit. There were also a number of other tuning steps taken from this session, and we will be working to roll out the applicable features to customers in the coming weeks.
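To give a flavour of the network encryption feature, server-side settings for Oracle's native network encryption typically go in `sqlnet.ora`. The lines below are a sketch only; the right algorithms and enforcement levels depend on your database version and security policy, so treat them as an illustration rather than a recommended configuration:

```
# sqlnet.ora on the database server: require encrypted client connections
SQLNET.ENCRYPTION_SERVER = REQUIRED
SQLNET.ENCRYPTION_TYPES_SERVER = (AES256)

# Optionally require integrity checksumming on the wire as well
SQLNET.CRYPTO_CHECKSUM_SERVER = REQUIRED
SQLNET.CRYPTO_CHECKSUM_TYPES_SERVER = (SHA256)
```

With `REQUIRED` set on the server, clients that cannot negotiate encryption are refused, so it is worth testing against your oldest client versions before enforcing this in production.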
The final session of the day to mention covered the tools Oracle are making available to assess database security and availability options. One of the key tools talked about was the Database Security Assessment Tool, which assesses the configuration, identifies risky users, discovers sensitive data and provides assessment reports that can be used to tune the system. We will be running this on all of our environments and making recommendations based on the reports for increasing both data security and availability.
That’s it for Oracle OpenWorld 2019 and my time reporting on the event. It’s been a fantastic conference with a lot learnt, which we are looking forward to bringing to you in the coming weeks. There is also a lot of information that hasn’t been mentioned here to keep this readable; if you would like to know more about the event, please contact us using the contact page or phone number and we will be happy to discuss things relative to your environment.
Posted on September 19th, 2019 - By Matt Parkinson
It’s day 3 of Oracle OpenWorld in San Francisco and we’re back gathering all of the latest information for you from Oracle’s annual conference being held at the Moscone Center. Today we will be bringing you information primarily around Oracle Linux which sits at the core of many of the Oracle products.
The first notable session of the day was a solution keynote on Oracle’s infrastructure strategy for cloud and on-premises systems. The session got off to a start by re-iterating what’s been a common theme each day this week: whether you want to run in Oracle Cloud Infrastructure, in other clouds, or on-premises, the entire suite of products is available and in the exact same format as on Oracle Cloud. One of the products used to demonstrate this was Exadata Gen 2, which is used as the foundation for OCI, with the exact same hardware and software available from Oracle as Cloud at Customer. The tagline “we use the same as our customers do” was also referenced to further reinforce this ethos.
The solution keynote also covered a number of innovations in Oracle Linux, the most important being Oracle Autonomous Linux, which was officially announced on Monday during Larry Ellison’s keynote speech. This new version of Oracle Linux primarily addresses security concerns by proactively fixing security issues and ensuring that the operating system keeps itself up to date, best of all without any reboots or downtime required.
In addition to kernel-level updates, Oracle Autonomous Linux is also able to patch user-space packages, covering previously high-profile vulnerabilities in glibc and OpenSSL. Not only are the vulnerabilities patched, but tripwires are also inserted so that should a user or process try to exploit the vulnerability, an audit entry is created and the system maintainer can be notified. The biggest aim with Oracle Autonomous Linux is moving systems to always being up to date and as secure as possible by removing the required human labour, and thus the possibility of human error.
Later in the day we continued with further information on Oracle Linux, this time on optimising the system to get the most performance out of Oracle Database. This session primarily addressed memory management, such as the use of huge pages and tuning the system’s swappiness, to ensure the database gets as much use of the available memory as possible. It also covered some additional support tools which can be used for gathering system information for troubleshooting. We’ll be working to bring all of the key points from this session to our Oracle environments as soon as possible.
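For a rough idea of what these memory settings look like in practice, the kernel parameters usually live in a sysctl configuration file. The values below are purely illustrative; the correct huge-page count must be sized to your SGA and available RAM, so do not copy these numbers onto a real server:

```
# /etc/sysctl.d/99-oracle.conf (illustrative values only)

# Keep the database's memory in RAM rather than letting the kernel swap early
vm.swappiness = 1

# Reserve huge pages sized to cover the SGA,
# e.g. roughly a 16GB SGA with 2MB pages
vm.nr_hugepages = 8200
```

Huge pages are pinned in memory and never swapped, which is precisely why pairing them with a low swappiness gives the database predictable access to its allocated memory.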
In between sessions we also took some time to visit the exhibition zone and talk with both current and potential future partners and providers such as DBVisit, Nutanix and SolarWinds. Nutanix easily had the best area in the exhibition zone, and not for anything technical: they had puppies! All of the puppies were rescues there for some TLC from many willing attendees, and they were being extremely well looked after.
Day 3 also finishes with the conference attendee party, known as CloudFest, which is being held at the Chase Center and features performances from John Mayer and Flo Rida. Attendee parties are always a great way to reflect on the information from the week and catch up with new and old connections over a few beers, but they unfortunately tend to fall before all the work is done, and the final day can be a bit of a struggle in the morning!
We’ll be back tomorrow for the 4th and final day of Oracle OpenWorld, including our round-up of the whole week and the most exciting things we’ve been introduced to that we’ll be bringing to you in the coming weeks.
Posted on September 18th, 2019 - By Matt Parkinson
We were back at the Moscone Center in San Francisco today attending Oracle OpenWorld 2019 and obtaining the latest information on all things Oracle. The day was planned to be quite varied, with information on MySQL, Oracle Database and Oracle Cloud across 6 different sessions.
The day started with a tutorial on InnoDB Cluster, a relatively new feature introduced to MySQL since Oracle’s acquisition. InnoDB Cluster offers high availability for MySQL databases with an ease-of-configuration ethos, which was clear to see from the demos given. What’s best about InnoDB Cluster is that it is even included in the community edition of MySQL, with the 3 main components of the cluster system (Group Replication, MySQL Router and MySQL Shell) also being open source.
The number of websites that run MySQL is phenomenal, so being able to add an enterprise feature like high availability to protect a database is a huge advantage for the many small and medium businesses that rely on MySQL for their databases. What’s more, in the demos the downtime for a failure was around 5 seconds, which for a community database system is incredible.
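The ease-of-configuration ethos comes from the AdminAPI in MySQL Shell, which drives the whole cluster setup. The sketch below assumes three instances that have already passed the `dba.configure_instance()` prerequisite checks; the host names, account and cluster name are placeholders of ours rather than anything shown in the demos, and the commands run inside MySQL Shell (`mysqlsh --py`) rather than a plain Python interpreter.

```
# Runs inside MySQL Shell (mysqlsh --py), which provides the `shell`
# and `dba` globals. Hosts, account and cluster name are placeholders.
shell.connect('clusteradmin@db1:3306')

# Create the cluster on the first instance...
cluster = dba.create_cluster('demo_cluster')

# ...then add the remaining instances; Group Replication is configured
# automatically, and MySQL Router can then route clients to the PRIMARY.
cluster.add_instance('clusteradmin@db2:3306')
cluster.add_instance('clusteradmin@db3:3306')

# Show member states and which node is currently the PRIMARY.
print(cluster.status())
```

From there a failed primary is detected and replaced automatically, which is where the few seconds of downtime seen in the demos comes from.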
The next big item for the day was Oracle Database In-Memory, which allows data in tables to be held in memory in a columnar format for faster access than the storage system can provide. The in-memory options significantly improve database performance, and combined with Automatic In-Memory management from Oracle 18c it’s getting much easier to create significantly faster databases on the same hardware.
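As a sketch of how the feature is switched on (the table name and sizes here are hypothetical examples of ours, not ones from the session):

```sql
-- Reserve part of the SGA for the In-Memory column store
-- (static parameter, so an instance restart is needed).
ALTER SYSTEM SET inmemory_size = 4G SCOPE = SPFILE;

-- Mark a hot table for population into the column store.
ALTER TABLE sales INMEMORY PRIORITY HIGH;

-- Check what has been populated and how far along it is.
SELECT segment_name, populate_status, bytes_not_populated
FROM   v$im_segments;
```

Queries then use the column store transparently, with the optimiser choosing in-memory scans where they help, so no application changes are needed.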
The last item to note for the day is new functionality for RMAN that allows backups to Oracle Cloud using the Database Backup Cloud Module. This feature allows easy archival of backups offsite into Oracle’s cloud object storage, which can help to meet a number of compliance requirements, such as an end-of-quarter or end-of-year archive, without impacting space on the local RMAN servers.
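For context, once the backup module is installed it presents itself to RMAN as an SBT media management library, so a cloud backup looks much like any other tape backup. The library and configuration file paths below are hypothetical placeholders of the kind written by the module installer, not values from the session:

```
-- RMAN sketch; SBT_LIBRARY and OPC_PFILE paths are placeholders
-- created when the backup module is installed.
RUN {
  ALLOCATE CHANNEL cloud1 DEVICE TYPE sbt
    PARMS 'SBT_LIBRARY=/opt/oracle/lib/libopc.so,
           ENV=(OPC_PFILE=/opt/oracle/config/opc_config.ora)';
  BACKUP DATABASE PLUS ARCHIVELOG;
}
```

Because it is just another channel type, existing retention policies and `DELETE OBSOLETE` housekeeping carry over to the cloud copies as well.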
What was interesting about RMAN backups to the cloud is that it’s a clear way to make the most of cloud alongside existing environments or in multi-cloud environments. Making the best of each environment to produce an overall multi-cloud solution is something we’re always keen to achieve for our clients. Finally for the day, Oracle made clear in a keynote again that they are working to deliver all of their products to wherever you want to consume them, and are not just pushing their own cloud services. Although Microsoft have also started to take this stance in the past couple of years, they are not as far ahead as Oracle appear to be.
Posted on September 17th, 2019 - By Matt Parkinson
September to December is what we like to call conference season, with several annual conferences and tech events taking place in these months. The most notable in recent years has been Microsoft Ignite, which has typically taken place in the last 2 weeks of September.
This year Microsoft Ignite has been moved to November, which has paved the way for us to attend the conference for another of our biggest products, Oracle OpenWorld. It kicked off today in San Francisco, and our technical director Matt Parkinson is there to pick up the latest news and developments that will shape our Oracle services in the coming year.
Oracle has been one of our fastest-growing product lines in recent years, and you may have also seen that we’ve recently been accepted onto the G-Cloud framework for our Oracle services, which makes attending the conference this year even more beneficial.
Part of our managed services offering is that we attend these events on behalf of our customers to obtain information that is relevant to them, and ultimately to move our services forward together. We’ll be working hard to bring you the latest updates in a new blog post every day this week from the conference, so be sure to check back each day for the latest post. You can also follow the hashtag #OOW19 on Twitter for information from ourselves and other Oracle partners throughout the week.
Day 1 got off to a packed start with 6 general sessions attended and a keynote speech from Larry Ellison, co-founder and CTO of Oracle. The sessions today covered a variety of different topics from new features in Oracle Database 19 to migration paths onto Oracle Cloud Infrastructure.
One of the early sessions covered 18 different methods to move to Oracle Cloud Infrastructure, varying in complexity, downtime and features. The key takeaway was that many of the familiar migration methods used on other infrastructure carry over to OCI, making it easy to migrate. It was also clear that, depending on your infrastructure and requirements, there is an option to suit.
Another key session today was around database innovation in 12c and 18c, which discussed some of the features available in newer versions of Oracle Database. The key points to note were the increased focus on security, with tools like privilege capture that can be used to analyse privileges and revoke those which are not required. In addition, the introduction of the unified audit trail makes it far easier to maintain set policies for auditing and to review the resulting logs.
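To illustrate the shape of those two features, privilege capture is driven through the `DBMS_PRIVILEGE_CAPTURE` package and unified auditing through named audit policies. The capture and policy names below are hypothetical examples of ours, not ones shown in the session:

```sql
-- Record which privileges are actually exercised, database-wide.
BEGIN
  DBMS_PRIVILEGE_CAPTURE.CREATE_CAPTURE(
    name => 'app_priv_capture',
    type => DBMS_PRIVILEGE_CAPTURE.G_DATABASE);
  DBMS_PRIVILEGE_CAPTURE.ENABLE_CAPTURE('app_priv_capture');
END;
/

-- ...after a representative workload has run...
BEGIN
  DBMS_PRIVILEGE_CAPTURE.DISABLE_CAPTURE('app_priv_capture');
  DBMS_PRIVILEGE_CAPTURE.GENERATE_RESULT('app_priv_capture');
END;
/

-- System privileges actually used; compare against DBA_SYS_PRIVS
-- to find grants that are candidates for revoking.
SELECT username, sys_priv FROM dba_used_sysprivs;

-- Unified auditing: one named policy, enabled in one statement,
-- with results reviewed in UNIFIED_AUDIT_TRAIL.
CREATE AUDIT POLICY logon_activity ACTIONS LOGON;
AUDIT POLICY logon_activity;
```

The appeal is that both are declarative: the evidence of what is actually used (or done) accumulates in data dictionary views rather than needing ad-hoc scripts.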
The final thing to touch on today was the keynote speech, in which several new announcements were made, such as the release of Oracle Autonomous Linux, reported to be the world’s first fully autonomous operating system. Oracle Autonomous Linux performs a number of system management tasks, using machine learning to learn about the system and its function and adapting the configuration to suit. The aim is to remove human interaction from the management of the servers and reduce the risk of an accidental configuration change or unexpected result causing downtime.
What was also clear from the keynote was that although Oracle’s preference is now Oracle Cloud, they recognise the need to allow their products to be run in whatever environment the client desires. This ultimately means there is a lot of support for Oracle products in other environments, such as other clouds, on-premises systems or even a hybrid scenario. This was further reinforced by the announcement that Oracle are aiming to bring the Autonomous Database and other Generation 2 cloud features to their customer-hosted solutions, allowing you to run the same environments and products as Oracle Cloud Infrastructure in any environment.
Lastly, we gathered a number of technical configuration changes today, along with some new and old features, which are starting to form part of our action points for implementing after the conference to improve our Oracle environments. One of our account managers will be in touch with Oracle customers after the conference to discuss these action points and rolling them out.