Channel: SAP HANA and In-Memory Computing

SAP HANA Cloud Integration - POV 2 (of 2)


This is a continuation of my first blog about SAP HANA Cloud Integration (HCI), where I share my perspective (or, you might say, randomly put down all the thoughts swirling in my head) on this evolving product.

 

Connectivity:

HCI currently has three connectivity options (let's call them 'adapters', as in SAP PI terminology): IDoc, SOAP and SFTP.

[Image: adapters.jpg]

SAP is soon planning to add SuccessFactors, REST and a few more adapters. I think there is a lot of room for improvement in this area.

 

Opinion:

  • The capabilities of the currently available adapters should be extended, e.g. adding archiving and dynamic configuration to the SFTP adapter.
  • Do not expect all the PI adapters to be available with HCI. Remember, this is a cloud environment; e.g. no customer will (or should) allow an insecure JDBC database update from the cloud.
  • Although SAP is planning to deliver a SuccessFactors (SF) adapter, note that SF can already be integrated seamlessly with HCI using the SOAP and SFTP adapters.

 

Monitoring:

Most of the monitoring capabilities for HCI lie within the 'Integration Operations' perspective of the Eclipse development environment. For each tenant (in simple words, each customer), a separate agent is available to track all the messages passing through HCI. For each message, detailed logs can be viewed showing the route the message traversed and the technical communication made with external systems. However, this is currently hidden from customers and managed completely by SAP, which acts as per the SLAs defined with the customer.

 

Opinion:

  • More information on the Service Provider Cockpit should be made available. I know it has some alerting capability, and I believe access to this cockpit is limited to SAP. If that is correct, I have one more point below.
  • As I understand it, customers have to contact SAP for details of these messages in case of any issues. In my opinion, SAP should provide a tool or access that gives customers the ability to view the current status of message processing. There might be customers who don't really need it, but a good number of customers would probably be interested in it.

 

Apart from this standard monitoring, applications like SuccessFactors have their own monitoring capability to view the exchanged messages:

[Image: SF.jpg]

For FSN, SAP back-end systems need the FSN Connector (an ABAP add-on) installed. The FSN Connector has a 'Connector Monitor' (accessed via a transaction) which provides additional monitoring capabilities in the ECC back-end, with details of all inbound and outbound messages. For more details on FSN, see this.

 

 

B2B Integration:

B2B support in SAP Process Orchestration (or PI) has become pretty powerful since the introduction of the B2B adapter suite and related tools last year. Even before that, the Seeburger suite of adapters for PI provided significant B2B connectivity options. A similar capability is therefore expected from HCI by existing and potential customers. There is no official B2B support in HCI. I used the word 'official' because the SFTP adapter can still be used as a connectivity option for B2B messages, provided you don't mind getting crazy with the mappings. The good news is that SAP is planning to provide B2B connectivity options with HCI. However, SAP hasn't clarified (or maybe I missed) many things in this area:

  • Will it be built from scratch?
  • Will it use any of the PI content?
  • Remember the Crossgate acquisition? What is happening on that front? I haven't seen SAP provide any future direction for this acquisition. There was not even a single session about Crossgate at TechEd this year. This makes me feel SAP will slowly move away from Crossgate and won't invest much in the future development of its offerings.
  • Will it be limited to the Ariba content planned for Q1 next year?

Opinion:

  • It is good to have multiple offerings for different types of customers. I understand that 'one size fits all' wouldn't work here for B2B. However, SAP should provide a direction for where they are heading with their B2B offerings, and a set of recommendations for potential customers before these customers pick a wrong path.

 

HCI Advantages (you probably don't realize)

As I did, you might already be comparing HCI's process integration and data integration capabilities with existing on-premise solutions like PI or BODS. When comparing with established products you will obviously find some features missing; however, there are some advantages that you might have missed so far:

  • The lightweight nature of HCI may be used in future for mobile-based communication with non-SAP back-end systems. SAP has already planned support for RESTful services in the near future.
  • Future releases and upgrades will have near-zero downtime for the business. Customers will be shielded from the underlying product changes. This means no more EhPs, SPs and patch work.
  • Not only externally, your communication within HCI is completely secure as well. Unlike PI, the data persisted at the pipeline steps in HCI is encrypted. Of course it can be decrypted in HCI if required, but that needs some additional certificate management at HCI.
  • The WebUI available for Data Integration provides an easy way to build and configure scenarios, reducing the need for a specific technology expert.
  • Multi-tenancy, in simple terms, completely separates one customer from another. The tenant-based architecture provides dedicated resources to each tenant or customer.
  • The other obvious advantages of a cloud-based offering: no maintenance, shorter implementation time, failover etc.

 

Possibly in Future:

There are a few things that I think SAP should consider (if they are not already considering):

  • HCI is currently hosted and supported by SAP only. In my opinion this should be opened up for partners once SAP thinks the offering has stabilized.
  • There doesn't seem to be any plan for providing Business Process Management (BPM) capabilities on HCI. Probably SAP doesn't have a business case for it yet. However, as I see it, this has some strong potential in future.

 

To Conclude:

HCI can currently be used as a complementary offering alongside your existing middleware (on-premise or cloud-based) or ETL tool. The lightweight nature of HCI enables several new dimensions for process and data integration. There are already clients using HCI to provision data to HANA while keeping their existing ETL tool in the landscape. However, presenting HCI as an any-to-any cloud integration solution will take some release iterations. To SAP's credit, you will see more and more sessions (or slides during sessions) at SAP events like TechEd talking about SAP's recommendations for integration solutions, but there is definitely more work to do on that front. For new customers introducing an integration solution into their landscape, it will be interesting to see how SAP positions HCI among its integration offerings in future.

 

 

P.S.: I think SAP should 'direct' the community to post most of their HCI-related blogs and documents (where relevant) in one space on SCN. I have already seen some blogs under 'SAP HANA and In-Memory Business Data Management', 'Enterprise Information Management', 'SAP HANA Cloud Platform Developer Center' and 'SAP Cloud Computing'. Whichever it is, there should be one. I am using 'SAP HANA and In-Memory Business Data Management' without any really good reason. I hope someone can comment on it and one of the moderators can move it all to one place.


Big Data drives next-generation business


If you thought the Internet was already integral to business, just wait. From information source to social network… now the ‘Internet of Things’ is poised to transform our lives.

 

The world already has more Internet devices than people. And the number is growing, adding exponentially to Big Data. We’re not just talking about computers and smartphones, but ‘smart’ boxes. That means interacting directly over the Internet with anything and potentially everything, from a promotional flyer to the shirt on your back, from a fridge to the food it contains.

 

It might be an application that allows you to wave your phone in front of a door for keyless entry. Or vending machines that schedule their own replenishment. Or even intelligent streetlights that report defects for quick repair. All of these examples streamline processes, represent considerable value and suggest huge potential.

 

Of course this all means bigger and bigger data, which in turn means your data management strategy is more important than ever. It's time to start thinking about Big Data for next-generation business through a 21st-century lens. This ultimately calls for a data platform that enables you to acquire and store petabytes of data from any source (whether structured, unstructured, machine or human data) and then analyze and visualize that data using advanced algorithms. According to a recent study by IDG Research Services, more than half of IT leaders pointed to an integrated solution as the best fit for their organizations.

 

They are looking for the ability to process, analyze and deliver quick, complete and accurate data to any user or application. All from a single, unified enterprise data environment. And all in real-time.

 

Which opens the door to developments such as paying with your cellphone, presence-based advertising and smart clothing that monitors your health.

 

The fact is, by transacting business in real-time, you’re faster, smarter, leaner—and ten steps ahead of the competition.

 

You can learn more at www.sap.com/realtime_data/our_experts or join the conversation with the hashtag #redefinedata

New Hybris-based HANA Marketplace (“Beta”) emerges


I was recently looking at some of the new offers on the HANA Marketplace when I noticed a small link in the upper right-hand corner announcing a new marketplace.

 

[Image: image001.jpg]

 

 

After clicking on the link, I started exploring the new site. It looks good and has some improved features such as being able to search the entire marketplace.

 

As a techie, however, I wanted to find out more about the technical foundation of the marketplace, so I looked at the HTML source of the page.

 

What was interesting was that I found references to Hybris in the code. 

 

[Image: image002.jpg]

 

Hybris was just acquired by SAP and here it was already being used in a marketplace. Impressive. This was the first example I’d seen of SAP really using its newly acquired e-commerce framework in a productive setting.

 

Despite this promising first step, there is still more work to be done in this area:

 

Examples:

 

  • One single marketplace. The ability to buy and sell HANA Cloud Platform extensions (for example, for SuccessFactors) has just been released. I'd like to see these extensions here as well – though an in-app marketplace is probably more relevant.
  • A mobile application for these marketplaces. Hybris has excellent mobile e-commerce solutions; what about reusing those?

 

Update: A recent video with Aiaz Kazai provides additional material about the new marketplace.

Big Data Geek - Is it getting warmer in Virginia - NOAA Hourly Climate Data - Part 2


So I discussed loading the data from NOAA's Hourly Climate Data FTP archive into SAP HANA and SAP Lumira in Big Data Geek - Finding and Loading NOAA Hourly Climate Data - Part 1. Since then, a few days have passed and the rest of the data got downloaded.

 

Here are the facts!

 

- 500,000 uncompressed sensor files and 500GB

- 335GB of CSV files, once processed

- 2.5bn sensor readings since 1901

- 82GB of HANA data

- 31,000 sensor locations in 288 countries

 

Wow. Well, Tammy Powlas asked me about global warming, and so I used SAP Lumira to find out whether temperatures have been increasing since 1901 in Virginia, where she lives. You will see in this video just how fast SAP HANA is at answering complex questions. Here are a few facts about the data model:

 

- We aggregate all information on the fly. There are no caches, indexes or aggregates, and there is no cheating. The video you see is all live data [edit: yes, all 2.5bn sensor readings are loaded!].

- I haven't done any data cleansing. You can see this early on, because we have to do a bit of cleansing in Lumira. This is real-world, dirty data.

- HANA has a very clever time hierarchy, which means we can easily turn timestamps into aggregated dates like Year, Month and Hour (see the SQL sketch after this list).

- SAP Lumira has clever geographic enrichments, which means we can load Country and Region hierarchies from SAP HANA really easily and quickly.
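
As an illustration of the time hierarchy point above, here is a minimal SQL sketch of the kind of on-the-fly aggregation shown in the video. The table and column names (SENSOR_READINGS, READING_TS, AIR_TEMP, STATION_REGION) are my assumptions; the actual data model is not published in this post.

-- Average temperature per year for Virginia stations, aggregated on the fly
SELECT YEAR(READING_TS) AS READING_YEAR,  -- turn the timestamp into a year
       AVG(AIR_TEMP)    AS AVG_TEMP       -- no pre-built aggregates involved
FROM   SENSOR_READINGS                    -- hypothetical readings table
WHERE  STATION_REGION = 'Virginia'        -- hypothetical region column
GROUP BY YEAR(READING_TS)
ORDER BY READING_YEAR;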

 

I was going to do this as a set of screenshots, but David Hull told me that it was much more powerful as a video, because you can see just how blazingly fast SAP HANA is with Lumira. I hope you enjoy it!

 

Let me know in the comments what you would like to see in Part 3.

 

Update: between the various tables, I have pretty good latitude and longitude data for the NOAA weather stations. However, NOAA did a really bad job of enriching this data and it has Country (FIPS) and US States only. There are 31k total stations, and I'd love to enrich these with global Country/Region/City information. Does anyone know of an efficient and free way of doing this? Please comment below! Thanks!

 

Update: in a conversation with Oliver Rogers, we discussed using HANA XS to enrich latitude and longitude data with Country/Region/City from the Google Reverse Geocoding API. This has a limit of 15k requests a day so we would have to throttle XS whilst it updates the most popular geocodings directly. This could be neat and reusable code for any HANA scenario!

 

Moving HANA out of its comfort zone: Cloud Foundry and SAS


The TechEd in Amsterdam is almost over, but there were two announcements made during the TechEd keynote in Las Vegas that I keep remembering / trying to understand. The key message at the TechEds focuses on HANA as a "platform", but the two announcements represented a different perspective for me: an evolution of HANA from existing in the friendly SAP ecosystem to being deployed / used in more hostile "enemy territory".

 

SAP's increasing focus on start-ups (as evidenced by the SAP Startup Focus program) and external developers (as demonstrated by SAP's presence at non-traditional developer events such as TechCrunch Disrupt) already demonstrates that SAP is aware of the importance of looking outside the boundaries of the SAP ecosystem. Yet the two announcements in question show just how far SAP is willing / able to go to achieve these goals.

 

Cloud Foundry

[Image: image001.jpg]

[SOURCE]

 

During the keynote at the TechEd in Las Vegas, there was an announcement about a cooperation with Cloud Foundry, with the hint that there would be more details at the upcoming TechEd in Bangalore. The slide above is the only publicly available detail about the partnership.

 

Before diving into the fun stuff / speculations, let me provide a quick description about Cloud Foundry.

Cloud Foundry is an open platform as a service, providing a choice of clouds, developer frameworks and application services. Cloud Foundry makes it faster and easier to build, test, deploy and scale applications. It is an open source project and is available through a variety of private cloud distributions and public cloud instances. [SOURCE]

 

Cloud Foundry is also closely associated with Pivotal, a spin-off from EMC and VMware.

 

To put it bluntly, Cloud Foundry is a competitor of the HANA Cloud Platform – this fact makes the announcement even more surprising.

 

Without more details, let’s attempt to analyze what the cooperation might contain based on the slide from the keynote.

 

The word "contributing" suggests that SAP will likely join the list of other contributors to the platform (which means signing the Corporate Contributor License Agreement (CLA)). SAP already contributes to various other open source communities (for example, Eclipse), so by itself the participation in Cloud Foundry isn't earth-shattering.

 

The more interesting aspect concerns exactly what SAP might contribute to Cloud Foundry. The slide above provides a clue: "integration" and "SAP HANA". My assumption is that the cooperation will focus on the ability to use SAP HANA as a database within this environment.

 

Cloud Foundry already supports a variety of other databases (MongoDB, MySQL, etc.) – indeed, Pivotal's use of GemFire already provides in-memory database functionality. Furthermore, Pivotal's recent announcement of in-memory storage combined with Hadoop functionality resembles SAP's Big Data architecture based on HANA. Thus, the ability to use HANA in this PaaS would fit well with its architecture.

 

The fact that productive HANA environments can soon be virtualized makes things even more interesting. VMware vSphere is mentioned specifically for such virtualization scenarios; vSphere can also be used for Cloud Foundry. Thus, virtualized HANA instances would appear to be a good fit in this environment.

 

 

SAS Partnership

 

The other announcement of interest at the TechEd in Vegas concerned a partnership with SAS.

SAP AG (NYSE: SAP) and SAS have unveiled a strategic partnership that is expected to advance in-memory data analysis capabilities for businesses across industries. SAP and SAS will partner closely to create a joint technology and product roadmap designed to leverage the SAP HANA® platform and SAS analytics capabilities. By incorporating the in-memory SAP HANA platform into SAS applications and enabling SAS' industry-proven advanced analytics algorithms to run on SAP HANA, decision makers will have the opportunity to leverage the value of real-time data analysis within their existing SAS and SAP HANA environments.

[SOURCE]

This partnership is surprising, because SAS and SAP are competitors in a variety of fields.

SAP and SAS are leaders (and fierce competitors) for in-memory platform technology, advanced analytics, and business applications. [SOURCE]

 

In a similar fashion to the Cloud Foundry agreement, HANA is being placed in a competitor's environment.

 

My POV

 

Although the focus of the recent TechEds has been on the platform, both announcements mentioned in this blog focus more on demonstrating that HANA is indeed a mature database. Everyone knows that HANA is destined to be THE database for SAP products, but to see it move into enemy territory and succeed there is even more intriguing, and is evidence that HANA adoption is moving into a new, broader phase.

 

Note: Although I’m focusing on the database aspects of the announcements in question, there are other interesting angles as well that motivated SAP to participate in both deals. The Cloud Foundry announcement is another example of SAP placing HANA into a different developer ecosystem – distinct from the often incestuous SAP ecosystem – in an attempt to increase its adoption. The use of SAS’ advanced analytics algorithms within HANA supplements existing domain-specific functionality and will enhance its appeal to data scientists involved in Big Data scenarios.

Big Data Webinar - Gaming Apps on SAP HANA


The Big Data webinar series continues this week and will be presented by Jan Teichmann, who works in HANA Product Management as a Development Architect for HANA in the Cloud. He is used to "thinking Cloud", having worked on ByD, checking whether a service consumption is Cloud-like with regard to robustness, deployability, immediate consumability, low entrance and exit thresholds for customers, etc. Having also worked with TREX and HANA for many years, he knows the strengths of combining OLTP and OLAP reporting, using a programming model that harnesses the computational power of HANA.

 

Title: Big Data - Gaming Apps on SAP HANA

Abstract: This webinar focuses first on the concepts and architecture of a wide range of applications that need automated real-time decision making and are based on the SAP HANA platform. The solution approach incorporates real-time streaming and filtering of massive amounts of unstructured data and shows the value that can be created. The main example used is Precision Gaming. Secondly, Jan will give a demo in which a small-scale Android gaming app is created online, and he will show how this can grow into a big data application. This can be a starting point for any developer who would like to add some analytics and real-time capabilities to his / her game.

 

Date: 12 November 2013 10 am CET

 

Link to webinar - see calendar invite (attached)

 

Interested in finding out more?

If the webinar inspires you to explore and learn more about Big Data then be sure to check out the document that I maintain as the single source of upcoming and on-demand webinars that are available  

 

Join the Big Data conversation

You can check out the Big Data relevant social resources on the big data website or use the hashtags #SAP #BigData

 

@rukso

Big Data Webinar - How to build Big Data applications for retail analysis using SAP HANA.


Did you get a chance to join the webinar on Gaming Apps using SAP HANA that was presented today? Apologies for the last-minute change to the webinar link.

 

Tomorrow you will have an opportunity to join another exciting webinar, How to build Big Data applications for retail analysis using SAP HANA, by John Appleby @applebyj. John has always been passionate about new technology. He cut his teeth helping global financial services and consumer goods companies build data warehouses to manage their business - especially when they want to run faster. These days, he travels weekly across continents, helping clients differentiate themselves using analytics technologies. This has involved building a team to design the solution, making it work and leading it through to successful completion. He is a strong advocate of in-memory computing and how it is radically changing the face of business. So you will have the opportunity to interact with him and ask him about SAP HANA, or about any other data platform like DB2 BLU, Hadoop or MongoDB for that matter. John is passionate that giving back to the community reaps long-term rewards, and this has shaped the last few years of his career - being a contributor to knowledge-sharing websites as an SAP Mentor, and a sometime advisor to Wall Street investors on emerging technologies and the companies bringing new innovations to market. When he is not busy designing in-memory apps, you may find him pounding the pavement to the beat of music in the hilly suburbs of Philadelphia, or traveling the world to meet new people and new cultures.

 

Here is a sneak preview of what John will be covering in the session:

[Image: Picture1.png]

 

Title: How to build Big Data applications for retail analysis using SAP HANA

Abstract:

The webinar will provide insight and guidance on how to build Big Data applications for retail analysis using SAP HANA. It will demonstrate the architecture and setup of Big Data apps with SAP HANA, and how to get amazing performance with billions of rows of data. John will also demonstrate two applications, for retail analysis and climate change.

Date: 13 November 2013 5 pm CET/ 8 am PST

Link to webinar: https://www.brighttalk.com/webcast/9727/90205

 

Interested in finding out more?

If the webinar inspires you to explore and learn more about Big Data then be sure to check out the document that I maintain as the single source of upcoming and on-demand webinars that are available. Additionally be sure to read the thought leadership articles on the big data website to discover how harnessing the value of Big Data can help your organization compete more effectively.

 

Join the Big Data conversation

You can check out the Big Data relevant social resources on the big data website or use the hashtags #SAP #BigData

 

@rukso

BIG DATA Platform Capabilities & Benefits


Big Data For Starters

In this blog I am going to explain various Big Data platforms, their capabilities and benefits.

 

Customer (BIG) Challenges


  • Making sense of the explosion of data: Organizations need the right tools to make sense of the overwhelming amount of data generated by declining hardware costs and complex data sources.
  • Understanding a growing variety of data: Organizations need to analyze both relational and non-relational data. Over 85 percent of data captured is unstructured.
  • Enabling real-time analysis of data: New data sources—such as social media sites like Twitter, Facebook, and LinkedIn—are producing unprecedented volumes of data in real time, which cannot be analyzed effectively with simple batch processing.
  • Achieving simplified deployment and management: Organizations need a streamlined deployment and setup experience that simplifies the complexity of Apache Hadoop. Ideally they would prefer to have fewer installation files that package the required Hadoop-related projects instead of making the choice themselves.
  • Big data also presents a number of challenges relating to its complexity:
  • How can we understand and use big data when it comes in an unstructured format, such as text or video?
  • How can we capture the most important data as it happens and deliver it to the right people in real time?

Building a Big Data Platform

 

As with data warehousing, web stores or any IT platform, an infrastructure for big data has unique requirements. In considering all the components of a big data platform, it is important to remember that the end goal is to easily integrate your big data with your enterprise data so you can conduct deep analytics on the combined data set. We can classify the various platform components into three aspects, with examples for each:

  • Storage System: Parallel DBMS, NoSQL
  • Handling: MapReduce, Apache Hive/Pig
  • Analysis Method: GNU R, Apache Mahout

 

[Image: 1_BIG_Build.JPG]

Source: CRISIL GR&A analysis

 

SAP Big Data Platform

SAP has integrated HANA with Hadoop, enabling customers to move data between Hive and Hadoop's Distributed File System (HDFS) on the one side and SAP HANA or SAP Sybase IQ on the other. It has also set up a "big data" partner council, which will work to provide products that make use of HANA and Hadoop. One of the key partners is Cloudera. SAP wants it to be easy to connect to data, whether it is in SAP software or software from another vendor.
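
One concrete shape this integration can take is HANA's Smart Data Access (available as of SPS06), which exposes a Hive table as a virtual table that can then be queried with plain SQL. This is a hedged sketch only: the remote source name, DSN, credentials and the weblogs table are my assumptions, not details from SAP's announcement.

-- Register Hive as a remote source via the Hive ODBC driver (names/DSN are illustrative)
CREATE REMOTE SOURCE HIVE_SRC ADAPTER "hiveodbc"
  CONFIGURATION 'DSN=HIVE_DSN'
  WITH CREDENTIAL TYPE 'PASSWORD' USING 'user=hive;password=hive';

-- Expose a Hive table inside HANA as a virtual table
CREATE VIRTUAL TABLE VT_WEBLOGS AT "HIVE_SRC"."HIVE"."default"."weblogs";

-- Query the Hadoop-resident data from HANA
SELECT COUNT(*) FROM VT_WEBLOGS;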

 

[Image: 2_SAP_BIG.JPG]

SAP Big Data Platform – Features

  • Real-time analysis of transactional and analytic data from disparate sources
  • Adaptable, powerful analytic models that expose insight
  • Integration with predictive analysis software and algorithms
  • Native full-text search, graphical search modeling, and user interface toolkit
  • Find valuable and actionable information in massive amounts of data
  • Accelerate business processes with rapid analysis and reporting
  • Invent new business models and processes
  • Reduce total cost of ownership (TCO) with less hardware and maintenance

 

SAP Big Data Platform – Benefits

  • Real-time business insight by analyzing business operations as they happen
  • Adaptable, powerful analytic models – create flexible views that expose analytic information at the speed of thought without assistance from IT
  • Extensive, source-agnostic data access – add external data to analytic models to incorporate data from across the entire organization
  • Multipurpose, in-memory technology – instantly explore and analyze all transactional and analytical data in real time from virtually any data source
  • Better decisions more quickly by gaining immediate access to all relevant information
  • Greater analytic flexibility through reduced reliance on IT
  • Dramatically reduced hardware and maintenance costs through a flexible, cost-effective, real-time approach for managing large data volumes
  • Improved planning, forecasting, and financial close processes by employing analytic models that uncover trends and patterns

 

IBM Big Data Platform

IBM is unique in having developed an enterprise-class big data platform that allows you to address the full spectrum of big data business challenges. The platform blends traditional technologies that are well suited for structured, repeatable tasks with complementary new technologies that address speed and flexibility and are ideal for ad-hoc data exploration, discovery and unstructured analysis.

[Image: IBM.JPG]


IBM’s integrated big data platform has four core capabilities: Hadoop-based analytics, stream computing, data warehousing, and information integration and governance.

In more detail:

Hadoop-based analytics: Processes and analyzes any data type across commodity server clusters.

Stream Computing: Drives continuous analysis of massive volumes of streaming data with sub-millisecond response times.

Data Warehousing: Delivers deep operational insight with advanced in-database analytics.

Information Integration and Governance: Allows you to understand, cleanse, transform, govern and deliver trusted information to your critical business initiatives.

 

Oracle Big Data Platform

  • Oracle made its big-data appliance available earlier this year — a full rack of 18 Oracle Sun servers with 864GB of main memory; 216 CPU cores; 648TB of raw disk storage; 40Gbps InfiniBand connectivity between nodes and engineered systems; and 10Gbps Ethernet connectivity.
  • The system includes Cloudera's Apache Hadoop distribution and manager software, as well as an Oracle NoSQL database and a distribution of R (an open-source statistical computing and graphics environment).

[Image: ORACLE.JPG]

Capabilities

  • Massively scalable infrastructure to store and manage big data.
  • Big Data Connectors delivers unprecedented load rates between Big Data Appliance and Oracle Exadata.
  • Cloudera’s Distribution including Apache Hadoop delivers managed and proven Hadoop to the enterprise
  • Cloudera Manager simplifies management of Hadoop
  • Advanced analytics with Oracle R on Hadoop data
  • Oracle NoSQL Database Community Edition pre-installed and configured
  • 648 TB of raw storage
  • InfiniBand Connectivity between nodes and across racks

Benefits

  • Optimized and Complete Big Data Solution
  • Integrated with Oracle Exadata to analyze all your data
  • Risk free installation and quick time to value
  • Single vendor support for your entire big data solution

 

EMC’s Big Data Platform

EMC has centered its big data offering on technology it acquired when it bought Greenplum in 2010. It offers a unified analytics platform that handles web, social, document, mobile, machine and multimedia data using Hadoop's MapReduce and HDFS, while ERP, CRM and POS data is put into SQL stores. Data mining, neural nets and statistical analysis are carried out using data from both sets, which is fed into dashboards.

[Image: EMC.JPG]

Capabilities

Technical Values

  • Performance - massively parallel architecture
  • Load speeds - 10TB/hr
  • Integration with SAS Grid
  • In-database analytics using Java, PL/R, etc.
  • Integration with many more BI and analytical tools
  • Integration with Hadoop for unstructured data analysis

Operational Values

  • Performance with minimal operational overhead
  • Performance tuning, controlling various configuration parameters
  • Backup and recovery solution
  • Most robust disaster recovery solution in the industry
  • Backed by a strong technical and customer support organization

 

Teradata Big Data Platform

Hortonworks and Teradata have partnered to provide a clear path to Big Analytics via stable and reliable Hadoop for the enterprise. The joint approach gives analysts the ability to leverage big data (social media, web clickstream, call center, and other types of customer interaction data) in their analysis while using familiar tools.

[Image: Tera Data.JPG]

Features

  • MapReduce integrated with, and encapsulated in, SQL
  • Out-of-the-box analytic library with over 50 functions, including graphing, path and pattern analysis
  • Massively parallel processing database for speed-of-thought analytics
  • Enterprise-ready Hadoop solution
  • SQL-based access to Hadoop
  • Integrated, single-vendor hardware and software solution that is easy to deploy, manage and troubleshoot
  • Tight integration with the Teradata Analytical Ecosystem
  • Lower skill ramp-up requirements

Benefits

  • New high-value analytics for analysts and business users, who can use traditional SQL-based tools and skill sets along with an analytic library of more than 50 pre-built MapReduce functions for analyses such as path, customer behavior, marketing, graph and text analytics
  • Greater insight across the enterprise, thanks to big data storage capabilities and integration with the Teradata Analytical Ecosystem, allowing business users to explore a wide variety of traditional and new data sources
  • Faster analytics results, with an analytics-optimized environment for rapid, on-the-fly data exploration and powerful processing capabilities that help you make the best decisions possible
  • Rapid time to value and low TCO, through easy installation, configuration flexibility and manageability
  • High performance and availability, with the proactive monitoring features of Teradata Server Management, Teradata Viewpoint and Teradata Vital Infrastructure services

 

A few more blogs on Big Data

Advanced level - Tech deep dive on BIG DATA Technologies & Applications.

http://scn.sap.com/community/hana-in-memory/blog/2013/04/30/big-data-technologies-applications

 

SAP HANA - Hadoop Integration # 1

 

SAP HANA - Hadoop Integration # 2

 

Also read Vivek's blogs on Big Data below:

Hadoop, Its Importance and Use Cases

Big Data Facts and Its Importance

 



50 first dates for the HANA Enterprise Cloud: Ending the isolation


The HANA Enterprise Cloud (HEC) is increasing in importance in SAP's cloud and HANA strategies. Yet this emphasis is usually relatively primitive, in that it considers the hosted applications (Business Suite, BW, etc.) in isolation rather than embedding them in broader, more realistic scenarios in which such applications interact with other entities.

 

I’ve already blogged about the HEC as a generic extension platform and as an environment for Big Data applications but there are a variety of other perspectives that I wanted to examine in more detail.

 

The HEC is like a pretty but shy teenager that desperately needs to get out and meet others. In this vein, I'd like to send the HEC on a few dates.


Date 1:  The HANA Cloud Platform (HCP)

 

The HCP is SAP’s PaaS offering and provides an excellent platform to extend other environments including the HANA Enterprise Cloud.  This possibility was also mentioned by Björn Goerke in a recent blog.

 

Yet there are many details that are unclear in such scenarios. Usually, HCP applications use the Cloud Connector to access OnPremise applications (such as CRM, Business Suite, etc.). In scenarios where the back-ends are located in the HANA Enterprise Cloud, this would probably look different.

 

[Image: image001.jpg]

 

This scenario raises some interesting questions:

  • Is the Cloud Connector even necessary? Both environments run in SAP data centers. My assumption is that the internal network is secure. Or are there other security concerns that might prevent such simplification?
  • Who configures the Cloud Connector in the HEC? The customer? SAP?  If SAP performs this configuration, is it part of the existing HEC service offering?
  • What happens when the HEC is being hosted by a partner? Do they have experience in this area?

 

Date 2:  Rapid Deployment Scenarios (RDS)

 

There are a variety of RDS available – there are also a few RDS that focus on supporting migrations to the HEC. My interest, however, is in the normal RDS solutions. What happens when a customer has their Business Suite running on the HEC? Are the same prerequisites necessary? Do partners who offer the RDS solution have access to the HEC instances of their customers? Since the application servers in the HEC are virtualized, are there possibilities to simplify RDS deployments by using preconfigured instances?

 

Date 3:  Fiori

 

Fiori mobile apps are all the rage, but no one has really looked at the possibility of a customer using a Business Suite running in the HEC for such apps. The architecture of Fiori apps has three main components (UI tools, SAP NW Gateway, SAP back-end), all of which could easily run in the HEC. I assume that HEC service offerings are in the works to provide such functionality to HEC customers. There is already a Cloud Appliance Library virtual appliance for Fiori, but the HEC could provide such functionality as a managed service that exploits the existing on-boarding services to quickly / easily provide Fiori-related content.

 

MyPOV

 

I'm stopping after three dates, but you get the picture. Many of the typical usage scenarios that include OnPremise applications (Gateway usage, the relationship to SAP Mobile Platform 3.0, etc.) can also be applied to the HEC. It is only when you depict such scenarios more realistically that the power of the HEC becomes evident. I have the feeling that current marketing campaigns concentrate primarily on the "cloud angle" rather than focusing on the broader scenarios which will emerge as customers move to this new environment and reflect on their experience in OnPremise settings.

Big Data webinar - Avoiding Big Data Chaos for Customers and Developers


 

The Big Data webinar series continues next week. I am thrilled that Vijay, who has an excellent grasp of technology, will be presenting the webinar titled "Avoiding Big Data Chaos for Customers and Developers". Vijay Vijayasankar is a global VP at SAP Labs and leads the engineering for big data applications. Prior to joining SAP, Vijay was the global head of SAP forward engineering at IBM. Vijay is an SAP Mentor alumnus and an active blogger at http://andvijaysays.com. Follow him on Twitter @vijayasankarv.

 

Title: Avoiding Big Data Chaos for Customers and Developers

Abstract

Big data means many different things depending on who you ask. The collection of technologies that power big data is also evolving at a rapid pace. This makes it hard for customers and developers to figure out how to get value out of big data and avoid chaos. In this session Vijay will cover the common causes of chaos and confusion around big data, and some principles for effectively building big data applications.

 

Date: 18 November 2013 5 pm CET 8 am PST

 

Link to webinar: https://www.brighttalk.com/webcast/9727/90207

 

 

Interested in finding out more?

If the webinar inspires you to explore and learn more about Big Data then be sure to check out the document that I maintain as the single source of upcoming and on-demand webinars that are available. Additionally be sure to read the thought leadership articles on the big data website to discover how harnessing the value of Big Data can help your organization compete more effectively.

 

Join the Big Data conversation

You can check out the Big Data relevant social resources on the big data website or use the hashtags #SAP #BigData

 

@rukso

Successful Go Live of ERP on HANA EhP7 at itelligence


 

CeBIT is primarily a trade event where you get to know new prospective customers, present yourself and your services, and generally network. For us, this year's CeBIT was also the start of an exciting project.

   

We had been asked by SAP if we wanted to use the Suite on HANA as an early adopter in production. We spontaneously agreed and decided to migrate our internal ERP system to SAP HANA. As a long-standing SAP partner, it is very important for us to use new technologies ourselves (I have given a video interview on this topic). "We use what we sell" - this is the motto we live by; it also creates a great foundation of trust with our customers.

   

We began the implementation a month later, with the project planning:

    

http://blog.itelligence.ag/2013/07/19/sap-hana-worldwide-launch-at-itelligence-episode-1-rough-planning-and-preparation/

    

And ... the first tests:

    

http://blog.itelligence.ag/2013/07/23/worldwide-sap-hana-implementation-at-itelligence-part-2-wave-0/

   

Part of our definition of a successful conversion to SAP HANA was the migration of the hardware and the database. We described this phase as Wave 1 and it is explained in more detail here:

    

http://blog.itelligence.ag/2013/08/05/sap-hana-introduction-at-itelligence-worldwide-episode-3-wave-1/

    

After the hardware upgrade, SAP surprised us with a request to migrate directly to the new EhP7. Again, we did not discuss this at length (we were somewhat dumbfounded) but decided to go for it. And so we were included in the Customer Validation Program for EhP7:

    

http://blog.itelligence.ag/2013/09/04/sap-hana-worldwide-introduction-by-itelligence-part-4-ehp-7-has-popped-up/

 

At the same time, we conducted tests with the DMO (Database Migration Option) tool, which combines the upgrade and the migration of the SAP systems to SAP HANA in one step. The result: we measured a downtime reduction of about 10 percent.

 

The upgrade to SAP ERP on HANA EhP7 and the database migration were declared successful on 21/10/2013. Here, too, I have written a final blog post with a closer look:

    

http://blog.itelligence.ag/2013/11/11/sap-hana-worldwide-introduction-by-itelligence-episode-5-sap-hana-go-live/

 

Today, about four weeks after going into production, the system is very stable and fast. We have not had any escalations or critical problems with business impact.

 

Overall, this project is a nice example of the great collaboration between SAP and itelligence as partners. So here again, many thanks to all our SAP colleagues who supported us during the transition.

    

To end on a small personal note: yes, I know that I have linked to many different sources with lots of content here. But in my view, such a complex topic cannot be adequately explained in just one blog post.

Demystifying Big Data


Albert Einstein said: "If you can't explain it simply, you don't understand it well enough"

 

We have several thousand blogs, articles and recorded talks about Big Data, but do we really understand it? I had the chance to talk with several people about it this year and still found people who do not have a clear picture of it.

 

So, let me try to explain it with a couple of examples:

 

First of all, what is the aim of Big Data? Simply: to process a large amount of data and provide enough predictive information to make an executive decision, if possible in a tablet format.

 

That is it, nothing more, nothing less. The problem is understanding that the decisions to be made need real data, processed in almost real time, based on the appropriate mathematical models, and displayed in an understandable format oriented to the mobility of the decision makers. And this, my friends, is what can be complicated.

 

Let's start with a couple of examples and bust some myths:

 

 

First myth "Big Data results in Big Reports"

 


 

Imagine yourself as the CEO of a large automotive corporation. You need to decide whether to close a plant in one country and open a new one in a more effective location, so you need to understand the trends in the candidate locations compared to the current one. For that, you need the following data:

 

Political trends (you don't want your new plant expropriated if the government changes) / people development / transport / logistics / banking or financial system health / incentives (taxes) / environmental laws / personal security / facilities, and some others.

 

Then you have to compare the last 10 years of this data with your investment and risk analysis and put it together in a comparative chart including all the candidate countries.

 

Last but not least, be honest: no one except "that guy" in controlling will play with a 900MB Excel file. So you, as CEO, need to have this information on your tablet, displayed if possible on a single screen, and you need it during your meeting with shareholders and investors to win support for your decision.

 

The first myth is busted (I love that TV show).

 

But the last example probably makes you conclude that Big Data is for large corporations only, and this is another myth. Let me give you another example:

[Image: http://www.marinemuseum.de/assets/images/100520_Atalanta.jpg]

Now imagine you are the commanding officer of a small warship (15 sailors) deployed on an anti-piracy operation close to Somalia's coast.

 

Suddenly you receive a "Mayday" call from a liquid gas cargo vessel that is being followed by a boat preparing a pirate attack. You leave the bridge immediately and move to the Combat Direction Center to lead the operation against this threat.

 

You will need to analyze several pieces of data before giving the executive officer the "Clear to engage!" order. But this is only one side of the coin: you also need to get this data ASAP. Time is crucial here; you cannot wait until the data is displayed, or the ship will be captured.

 

What kind of information are we talking about?

 

You realize that a mistake can make the ship explode (it is carrying liquid gas), which would not only affect your own security and the cargo ship's crew; a nearby coastal city full of civilians would also be impacted by the explosion, causing several casualties. So, what to do?

 

The Combat Direction Center systems will provide you with big data, and the mathematical models will process the information to give you the options for effective military action before the vessel is boarded by the pirates, and to ensure the safety of all the ships, people and crew (including your own team).

[Image: http://www.defense.gov/dodcmsshare/newsphoto/2011-03/hires_110304-N-SG869-010.jpg]

 

This is also Big Data and real-time decision support, so the second myth is also busted.

 

You can adapt Big Data models to your own business needs and provide decision makers with realistic information; with SAP you have several tools to make use of your Big Data and to display it in the required way:

 

 

SAP HANA

SAP UI5

SAP Fiori

SAP Gateway

SAP Business Intelligence

 

and more ...

 

I hope you find this article interesting and start enjoying the use of big data to support your decisions.

My PoV: SAP Business Suite on HANA


This detailed PoV is made up of the following sections:

  1. Recommendations
  2. SoH: POV

Recommendation:

  • Weigh the benefits and opportunities against the challenges and risk

Business Suite on HANA provides customers with real opportunities to improve their businesses. However, the significance and value of these benefits will vary between organisations; they need to be identified and balanced against the fact that the technology is as yet unproven in a production environment (with the exception of a few beta customers).

 

  • Use value identification/discovery workshop (industry, technical expert) to identify sweet-spots

  Each customer should carry out a business value exercise to identify the possible sweet spots. Account teams / industry teams / SAP / analytics teams should do this work beforehand instead of relying on customers to identify them.

 

  • Create a HANA roadmap aligned to the organisations application and platform strategy and review periodically
  • Establish whether the organisation is a leading-edge adopter (risk taker, leader), a challenger or a follower - based on the opportunities identified against risks and costs.

Customers should take the opportunity to evaluate Business Suite on HANA: evaluate the possible benefits and opportunities that the real-time data platform (HANA) offers, and develop a HANA roadmap that fits into their application and platform strategy.

  1. The roadmap should outline a time frame for adopting HANA in its different forms (side-car, BW, Business Suite) for strategic planning.
  2. Periodic review of the roadmap with the associated benefits should be done as the customer base grows. Time frames could be:
    1. Leading edge adopters: within 12 months
    2. Challenger:  2- 5 years
    3. Follower: >5 years / defer for foreseeable future.
  3. As part of the HANA roadmap/strategy, the customer should consider how the data warehouse fits in, in light of the enhanced SoH analytical capability (the SAP HANA Analytical Framework and additional analytical apps created on HANA).
  • Understand the cost of ownership: database license costs and the investment costs of adoption: migration, integration etc.

Whilst the business case for SoH should be based around business benefits, there are notable IT benefits; however, there are additional IT costs too.

  • Organizations interested in SAP HANA DBMS technology for large-scale, business-critical deployments should adopt HANA when its disaster recovery, high availability and partitioning tools are sufficient to meet current needs.

 

SoH: My POV

Embed Real-Time Analytics in the Transactional/Operational Process/System

The ability to perform analytics on transactional data in real time and embed them into processes.

  • No need to extract data from the transactional system and aggregate it in a data warehouse for reporting.
    • Eliminating the data warehouse layer and its associated costs (licenses, hardware, software, support, landscape systems) - in some cases.
    • Reducing the time taken to access the data for reporting, since it does not need to be loaded into and aggregated in the data warehouse; the data is available for reporting as soon as it is created in the transactional system.
  • Embedded intelligence, real-time reporting, prediction, operational data with social/sentiment analytics etc. - all requiring different types of people.

POV: There is still a case for the data warehouse. For example, if there were a requirement to do historical reporting over a dataset extending back 10 years, it is unlikely that a transactional system would hold such data; the same goes for an enterprise data warehouse where there are many different sources of data.

Business Benefits

  • 24 pre-built scenarios
  • “Imagine” new use cases – the art of the possible
  • Base for new business models
  • Improved performance:

PoV: The technology in the HANA platform can speed up application processes compared to other database systems. SAP has a roadmap (some delivered, others to be developed) of 24 business processes/scenarios optimised for HANA. The resource-intensive processes, such as material requirements planning (MRP), will yield the greatest benefits. This speed will enable the business to interact with the results differently, significantly changing the business interaction and opening up new competitive business models.

 

A break away from the traditional SAP mentality that organisations' processes had to be adapted to "best-practice" SAP processes comes under the guise of the terms innovation and "imagine". SAP is providing the technical framework for companies to re-think how they do business and interact with customers.

 

Technical Benefits

  • Technically non-disruptive migration (database migration)
  • Lower total cost of ownership.
  • A simplified DBA experience.
    • Dramatic landscape simplification.
    • More functionality with less code.
    • End of batch programmes

POV: The technology is being proven in an increasing number of implementations.

In the words of others...


If you are the kind of person who would rather listen to people than read through education material, then you might be interested in this:

http://events.sap.com/teched/en/session/8596

 

Behind this link you will find the recording of my SAP TechEd session RDP302 - Understanding SAP HANA Database performance, presented by my team lead and colleague Richard Bremer.

Having listened to it, I wish I had been sitting in this session myself a couple of years ago.

 

- Lars

SAP HANA : Points to ponder


HANA is an in-memory technology that moves business logic to the database level instead of doing the computation at the application server level (the traditional method). This got me pondering the following points (a small SQL sketch after the list illustrates the code-to-data idea):

 

a) In the traditional approach, business logic computation was also done in memory, at the application server; so how is HANA different?

b) As we are combining data fetching and processing in a single place, will it result in clutter/confusion in development?

c) The row-store to column-store shift: how will it work with transactional data?

d) In-memory technology for transactional systems: a paradigm shift in the way hardware is looked at.

e) How can HANA help in this highly competitive and fast-moving world to gain or keep a leadership position in the market, by being at the edge and sensing the market, customers and competitors?

f) HANA: how is it a major shift from the traditional approach in terms of the overall processing of user requests?

g) Big data on HANA (in-memory) makes sense, as huge amounts of data have to be analyzed and processed. But why is SAP coming up with ECC on HANA, CRM on HANA, etc.?

h) How and what will change in ABAP coding with ECC on HANA?

i) With calculations moving to HANA, what will be the ABAP programmer's role in performance optimization of the application?
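
To make the code-to-data idea behind these questions concrete, here is a minimal SQL sketch (the table and column names are illustrative, not from any SAP system): instead of fetching all line items to the application server and summing them in an ABAP loop, the aggregation is pushed down to HANA, which returns only the small result set.

-- Code-to-data: the database aggregates and returns only the result,
-- instead of the application server looping over every row.
SELECT CUSTOMER_ID,
       SUM(NET_AMOUNT) AS TOTAL_SALES
FROM   SALES_ITEMS                -- hypothetical line-item table
GROUP BY CUSTOMER_ID;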

 

I attended sessions in my organization on the HANA overview and on ABAP on HANA. These helped me understand how pervasive HANA will be in the near future (in organizations that run SAP business software as their business management tool), as it comes with a number of features:

  • The option of using HANA as a secondary database in the initial stages of adoption gives organizations' IT departments a way to try out the HANA solution without the concern of impacting business activity.
  • In-memory technology significantly improves application performance.
  • Moving the calculation part to the in-memory database, where the data is readily available, will reduce the complexity of the coding on the ABAP side.
  • HANA Extended Application Services (a full-featured application server, web server and development environment within the SAP HANA appliance) is a major shift, as full-fledged applications can be realized on HANA with the browser as the client (note: no intermediaries, from a plain application perspective).

With the number of performance optimization options available in the ABAP-on-HANA scenario, it will be a paradigm shift in how development is looked at and carried out.

 

I was involved in a POC in which we tried to compare the performance of a SAPUI5 report (0.2 million sales cycle records - I know this is insignificant when it comes to the capabilities of HANA) with traditional ABAP vs. HANA as the backend. With ABAP as the backend, we used an OData-compliant Gateway service to supply data to SAPUI5. With HANA, we used an XSJS service as the data provider. We saw a major performance improvement in the HANA-based application over the normal application.


Exception aggregation modeling with Graphical Calc view


Usual disclaimer:

 

Please note that the following model was implemented on SPS06 (revision 60 and onwards). I am not sure about the feasibility of this approach on earlier revisions. The observations on the model below are based on my personal experience and opinion.

 

The business scenario is quite common in BI reporting: a KPI needs to be calculated at a low level of granularity, but reporting is required at a higher level of granularity. With the default aggregation behavior of the calc view, the constituents of the calculated KPI may also be aggregated at the reporting granularity level, resulting in an incorrect value for the calculated KPI.

 

To elaborate on this, consider the following data model for sales transactions. The transaction data is captured per product sold on a given date in a store. The table structure can be defined as follows (the CREATE statement and table name in the sketch are mine; the original post gives only the column list):


CREATE COLUMN TABLE SALES_TRANS    -- table name is illustrative
( STORE_ID    nvarchar(10),  --- Store Id
  PROD_ID     nvarchar(10),  --- Product Id
  PROD_CAT    nvarchar(10),  --- Product Category
  PROD_PRICE  decimal(10,2), --- Product Price
  SOLD_QTY    integer,       --- Product Quantity Sold in a Sales transaction
  TRANS_DATE  nvarchar(8),   --- Sales Transaction Date
  STORE_REG   nvarchar(10)   --- Region in which the Store is located
);

 

The business rule could be:

The product price may vary each day; hence Net Sales must be calculated from the product price valid on the transaction date and the quantity sold on that day.

 

*** In an ideal scenario, Net Sales should be calculated at data load time and persisted in the data model. This has a positive impact on performance and is also the recommended modeling option. The approach explained below should be implemented only when persisting the calculation is not feasible, for example due to more complex requirements or because Product Price and Sales Transactions come from different sources / data stores.
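For illustration, here is a minimal sketch of the persistence option in plain SQL, assuming the transaction table is named SALES_TRANS (the table name and the NET_SALES column are hypothetical, as the original model does not name them):

-- Add a persisted column for the calculated KPI (names assumed).
ALTER TABLE SALES_TRANS ADD (NET_SALES decimal(15,2));

-- Populate it once, at load time, at transaction granularity,
-- so no calculation or exception aggregation is needed at query time.
UPDATE SALES_TRANS
   SET NET_SALES = PROD_PRICE * SOLD_QTY;

With the value persisted, a plain SUM(NET_SALES) aggregates correctly at any reporting granularity.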

 

The sample Sales transaction data is as follows:

table_data.JPG

In the above example, Net Sales in the sales transaction data is calculated as Product Price * Quantity Sold, and it must be calculated at transaction-level granularity.
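In plain SQL terms (assuming the hypothetical table name SALES_TRANS again), the difference between the two calculation orders can be sketched as follows:

-- Wrong: calculate after aggregation. At Product Category level the
-- constituents are summed first, so totals are multiplied instead of
-- line items, overstating Net Sales.
SELECT PROD_CAT, SUM(PROD_PRICE) * SUM(SOLD_QTY) AS NET_SALES
  FROM SALES_TRANS
 GROUP BY PROD_CAT;

-- Correct: calculate before aggregation, at transaction granularity,
-- then sum the per-transaction results.
SELECT PROD_CAT, SUM(PROD_PRICE * SOLD_QTY) AS NET_SALES
  FROM SALES_TRANS
 GROUP BY PROD_CAT;

The rest of this post shows how to get the calc view to behave like the second query.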

 

The standard calc view model, based on a standard Analytic view on the transaction table, is as follows:

 

normal_model.JPG

Please note that the Multidimensional Reporting property is set to True, which adds a default aggregation node to the calc view. The default aggregation behavior of all base KFs is SUM. The Net Sales calculation is defined as shown below and is therefore executed as "Calculate after aggregation", resulting in the following output.

 

Net_Sales_KF.JPG

The output of the query is as follows:

 

query_one_output.JPG

The Product Price is shown for explanation purposes only; a product price at Product Category level does not really make any business sense.

 

The expected correct output for the above query is as follows:

 

Excel_output.JPG

 

To achieve the above output, we need to make the following changes to the model.


  1. Create the model with the Multidimensional Reporting property set to False, which adds a default Projection node to the model.
  2. Add an aggregation node on top of the Analytic view to define the aggregation behavior.
  3. Set the Keep Flag property for all attributes that define the exception aggregation criteria. In this case, aggregation must happen at Store, Product, and Transaction Date level irrespective of the reporting granularity. The Keep Flag always adds these columns to the GROUP BY clause, even if they are not selected in the query. Please note that the additional columns impact performance, as each column join is added to the result set.
  4. Define the calculated KF for Net Sales in the aggregation node. The calculation remains the same as shown in the earlier screenshot. (A rough SQL equivalent of this two-step aggregation is sketched right after this list.)
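Conceptually, the Keep Flag turns the query into a two-step aggregation. A sketch of the equivalent plain SQL, again assuming the hypothetical table name SALES_TRANS:

SELECT PROD_CAT, SUM(NET_SALES) AS NET_SALES
  FROM ( -- Inner step: aggregate at the Keep Flag granularity
         -- (Store, Product, Transaction Date), whether or not the
         -- report selects these columns, and calculate the KF there.
         SELECT STORE_ID, PROD_ID, TRANS_DATE, PROD_CAT,
                SUM(PROD_PRICE) * SUM(SOLD_QTY) AS NET_SALES
           FROM SALES_TRANS
          GROUP BY STORE_ID, PROD_ID, TRANS_DATE, PROD_CAT ) t
 GROUP BY PROD_CAT;  -- Outer step: aggregate the calculated KF
                     -- up to the reporting granularity.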

 

new_model.JPG

keep_flag_settings.JPG

The query executed on this model will result in the required output.

 

query_two_output.JPG

Well, now that works as expected.

 

Performance impact:

 

Please note that this implementation has a performance impact. If we look at the visual execution plans of both queries, we can see that the second query processes not only the columns in the SELECT statement but also all columns with the Keep Flag property set to true.
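Besides the visual plan, the column usage can also be checked with a plain EXPLAIN PLAN. A sketch, assuming the calc views are activated under a hypothetical package "mypackage" and expose the calculated KF as NET_SALES:

EXPLAIN PLAN SET STATEMENT_NAME = 'excp_aggr' FOR
SELECT PROD_CAT, SUM(NET_SALES) AS NET_SALES
  FROM "_SYS_BIC"."mypackage/ZGVC_PROD_SALES_EXCP_AGGR"
 GROUP BY PROD_CAT;

-- The recorded operators show which columns each engine touches.
SELECT OPERATOR_NAME, OPERATOR_DETAILS
  FROM EXPLAIN_PLAN_TABLE
 WHERE STATEMENT_NAME = 'excp_aggr';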

 

Execution for the first query on ZGCV_PROD_SALES looks something like:

plan_1.JPG

The execution plan shows only those columns required in the SELECT statement.

 

Execution for the second query on ZGVC_PROD_SALES_EXCP_AGGR (with Exception aggregation) looks like:

plan_2_1.jpg

As we can see from the execution plan above, performance is impacted by the additional column joins: a higher data volume (20 rows) is passed from one engine to another, compared with 3 rows in the earlier execution. The size of the generated result set depends on the granularity specified via the Keep Flag setting.

plan_2_2.jpg

As you can see from the screenshot above, all the columns in the SELECT statement, plus those with the Keep Flag set to true, are used in column joins to generate the result set.

 

As mentioned earlier, the better option is to persist such values in the database. If complex business or technical requirements make such persistence infeasible, the option above should work fine, but at the cost of some performance impact.

 

All comments, suggestions, and discussions are most welcome.

Big Data webinar - Strategies for implementing your Big Data using an Architecture Approach


 

On Monday we had a great webinar on how to avoid Big Data chaos by Vijay Vijayasankar. If you did not get a chance to join, be sure to check out the recording, as Vijay makes the complex topic of Big Data easy to understand and gives practical examples of the value of using Big Data.


Today we will have another informative topic, presented by David Dichmann and Laurie Barker, that will help you assess your current architecture and prepare it for successfully implementing your Big Data projects. David is a Product Manager with the SAP HANA team, specializing in architecture modeling tools. He manages the technical innovation for the modeling and metadata management lines of products, including the technical vision and direction of SAP Sybase PowerDesigner, a market-leading solution for information and enterprise architecture. David has over 24 years of industry experience in both technical and business roles, working with small, start-up, and established businesses. He has been published in industry magazines and is a regular speaker at industry events. Laurie is the Product Marketing Manager for SAP PowerDesigner, as well as a member of the Real-time Data Platform team. She is a seasoned marketing professional with over 25 years of experience marketing for high-tech companies.

Title: Big Data webinar - Strategies for implementing your Big Data using an Architecture Approach

 

Abstract

Dealing with Big Data? Gaining an understanding of your current architecture will allow you to strategize, plan, and implement Big Data into your infrastructure. Prepare for business transformation by understanding the current state and modeling the future state. During this webcast, the SAP PowerDesigner Product Manager will demonstrate how an architectural approach to Big Data will allow you to plan and integrate with existing systems to make the most of this leading technology. SAP PowerDesigner, a leader in Gartner's 2013 Enterprise Architecture Magic Quadrant, is helping SAP customers worldwide with transformative technologies like the in-memory database, real-time data movement, and Big Data.

Date: 20 November 2013, 5 pm CET / 8 am PST

How to join: Download the calendar invite

That’s a wrap for the year

As my work for the year draws to a close (yes, I will be on vacation in December), I would like to say thank you to all the presenters, SAP experts, and SAP Mentors who enthusiastically supported the Big Data webinar initiative by sharing their knowledge and providing a wealth of usable and practical information related to Big Data. YOU are the best! If Big Data is a topic that interests you, be sure to check out the document that I maintain as the single source of the on-demand Big Data webinars that are available. No matter what your level of interest, I am sure you will love the series as much as I have loved bringing it to the community.

I am working on an exciting topic for the New Year, and hopefully I can share the scope and get you to help me make it a roaring success. Till then, it's me, @rukso, signing out.

HANA Business Case: de-Mystified


It has been a while since I last blogged here on SCN. Partly I needed to find my way, and partly I was looking for additional sources of inspiration. I have met some excellent professionals in the last couple of weeks who are all working on SAP HANA. There has been a lot of buzz about SAP HANA and in-memory computing in general lately, and I believe the good news is that if the incumbents are jumping on the in-memory train as well, it is a clear signal that this technology will drive fundamental changes and value.

 

I came across two blog posts by Andrew De Rozairo from SAP. His blog Building The Business Case For Your HANA Investment illustrates very nicely the four main parts to be considered when building a business case.

He started by detailing the productivity savings, and you can expect more updates soon as the series is completed. The post underlines in a very illustrative way that it is not only the vast performance of SAP HANA in itself that drives the savings but also the possibilities that are opened up.

 


NetWeaver Portal 7.4 is available on the HANA Enterprise Cloud – What comes next?


I’ve been watching the HANA Enterprise Cloud (HEC) closely but I can still be surprised by how it evolves. For example, last week I read a blog stating that the SAP NetWeaver Portal 7.4 – based on HANA – would soon be available on the HEC.

 

Previously, more traditional applications (but still HANA-based) had been made available in this environment:

 

 

I’ve blogged about the HEC as an extension platform but I had totally ignored the possibility of the NetWeaver Portal existing in this environment.

 

Note: There is an accompanying explanatory document concerning this new offer, which I find quite good and which contains a useful comparison of NetWeaver Portal 7.4 (on-premise), NetWeaver Portal 7.4 (HEC), and HANA Cloud Portal.

 

In this blog, I’d like to focus more on why the NetWeaver Portal is even being offered in this environment. The document mentioned above contains the following description.

 

[HEC] has been expanded beyond the “pure” HANA scenarios to enable NetWeaver scenarios which run on HANA. HEC is Ideal for mission-critical applications such as SAP Business Suite and SAP NetWeaver Business Warehouse scenarios, both of which have strong relationship to the SAP NetWeaver Portal. For new customers looking to deploy portal or for existing NetWeaver Portal customers with a “cloud first” / “cloud only” policy, this is a highly recommended option. From a feature perspective, this is the same as the on premises SAP NetWeaver Portal 7.4. So the decision criteria in this case between the two offerings is based on managed vs. on premises (landscape) considerations only. (Emphasis: mine) [SOURCE]

 

For me, “a strong relationship” implies an extension of the more traditional business applications. Although it is theoretically possible, a scenario with an isolated portal instance, without the accompanying HEC-based applications with which to integrate, is probably suboptimal. For users evaluating the new HANA-based 7.4 NetWeaver Portal, this isolated instance might be an option, but having multiple applications running in HEC – perhaps sharing HANA resources / databases – is a more relevant scenario that makes better use of the advantages provided by the environment.

 

What NetWeaver application comes next?

Once you open the door for NetWeaver applications in the HEC, what application might be the next choice to appear in this environment?

 

My choice is NetWeaver Process Integration (PI). Although it has not been officially announced as being ported to HANA, there appears to be work being done in this area. Like the NetWeaver Portal, PI could also act as an extension platform for the existing applications in this environment. Although HEC-based PI installations might appear to compete with the HANA Cloud Integration offering and with on-premise PI installations, I think similar arguments exist to distinguish the distinct PI-related offers as have been proposed for the Portal-related solutions.

 

Can HEC partners support such new applications?

There are a variety of HEC partners right now. These partners usually provide Business Suite / BW support, but as the list of available applications in the HEC expands to include more exotic applications, such partners might not have the expertise to support them. This is especially true for HEC partners with more of an IaaS focus. In such scenarios, it is recommended that customers ask their HEC partners whether the relevant experience is available.

