Channel: SAP HANA and In-Memory Computing

Join the Big Data Webinar - Big Data Maturity Model


The Big Data webinar series continues this week with a focus on how you can assess where your organization is on its Big Data journey and where to get the most value from Big Data initiatives.

 

The Big Data Maturity Model webinar will be presented by Imran P. Siddiqi, Senior Principal of Value Engineering at SAP America, where he advises CIOs and CFOs on creating business value from Big Data. Prior to joining SAP he was at CEB, a best-practices research and analysis firm based in Arlington, VA, where he held multiple roles, including Senior Director of Strategic Marketing, Chief of Staff to the Chairman and CEO, and Senior Director of Research and Content Delivery at the CFO Executive Board. Prior to CEB, Imran held roles in strategy consulting, including at Bain & Company and at Kaiser Associates. He started his career in Finance & Planning at Engro Chemical Pakistan Ltd., where he served as Business Analyst and subsequently as Treasurer. In addition, for the past 10 years Imran has served on the boards of a national non-profit organization, a local community organization, and a public charter school in Washington, DC. Imran's Twitter handle is @ImranP

 

Title:  Big Data Maturity Model

When: Sep 17, 2013, 7:30 am PST / 4:30 pm CET

Abstract:

Let's face it: the promise of Big Data has not yet lived up to most of our expectations. For some, the topic is seen as a series of challenges or problems to overcome rather than a set of opportunities to take advantage of. Much of that is because most BI practitioners have been inundated with tools and technologies. While understanding the technology aspects of Big Data is important, most organizations cannot pinpoint where they are on their Big Data journey, or where they can go next for the most value. Yet many organizations are already finding tremendous ROI. What can we learn from them? This session will present a Maturity Model and Architecture framework for pinpointing where you currently are and how you can progress to get Big Value from Big Data.

 

Link: http://www.brighttalk.com/webcast/9727/87173

 

 

Use the hashtags #SAP and #BigData to follow the conversation on Twitter or share your impressions.

 

 

Follow me on Twitter: @rukso


Quick note on Plan Visualization in NetWeaver


OK, just a very quick note today with loads of screenshots and only a few lines of text.

The core statement (for those with little time available) is:

 

With a current BASIS SP (my test system is on SP 9) you can create a Plan Visualization from within NetWeaver and display it later in SAP HANA Studio.

 

And here's how. But first, something else I really like:

With NetWeaver 7.3 there is now the option to store stack traces together with the SQL calls in the SQL Performance trace (ST05).

Why is this really useful?

Think of well-structured programs that encapsulate all DB access in a dedicated ABAP class, as SAP BW does.

Reviewing a SQL trace of such a program, you can easily find which SQL statement runs longest, but finding the ABAP application logic that actually triggered it can be tricky.

 

Every time you click 'Display Call Position in ABAP programs' you only end up in the DB access class.

Typically, you would then have to set a breakpoint in this class and wait until "your" statement comes along... not very efficient.

 

Today, you just activate the stack collection in the 'Performance trace' menu of ST05 like this:

st05_menu.png

After that, you find the stack trace collection is active:

st05_stack_active.png

Now, run the trace as usual and display it:

st05_display_call_stack.png

Select the trace line you want to know the call stack for and click on the 'stack' icon marked in the picture above.

You then get a dialog window containing the ABAP call stack that was active when this specific trace line was written:

st05_call_stack.png

And as if that weren't good enough, you can even double-click on the stack level you are interested in to navigate to the ABAP source:

ABAP source.png

 

Pretty cool, isn't it?

 

Ok, but now back to the Plan Visualization...

Back in the trace list, you have to click on the 'EXPLAIN...' icon to display the explain plan for the statement of interest:

st05_explain_planviz.png

 

Clicking the somewhat obscure 'Execution Trace' button triggers the creation of an XML trace file that contains all the information required for the Plan Visualization function. In fact, it's the same file format that SAP HANA Studio uses to store PlanViz files on the file system.

 

Once it's finished, you get a message box:

st05_planviz_file_ready.png

 

As the file name 'xml.plv' indicates, it really is just a text file containing XML tags:

st05_planviz_file_xml.png
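
Since it is plain XML, you can inspect the file with any XML tooling. Here is a minimal Python sketch (assuming a local copy of the exported file; the element names inside are release-dependent, so it only confirms well-formedness and lists the top-level structure):

```python
# Minimal sketch: confirm the exported 'xml.plv' file is plain XML and peek
# at its top-level structure. The element names are release-dependent, so
# nothing here relies on a specific schema.
import xml.etree.ElementTree as ET
from collections import Counter

tree = ET.parse("xml.plv")   # path to the file created via ST05
root = tree.getroot()

print("root element:", root.tag)
print("child elements:", Counter(child.tag for child in root))
```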

Visualize it!

 

Now, to actually get the graphical output, all you have to do is open SAP HANA Studio and drag and drop the 'xml.plv' file into it.

Alternatively, you can use the 'File' -> 'File open...' option.

 

st05_final_planviz.png

You don't even have to be connected to the system to get the graphical output, as all the information is in the XML file.

This means: no problems with permissions on DB level, no system access required.

 

Whatever you say, I like it!

But I'm pretty sure that you will agree...

There you go, now you know!

Thanks for taking the time again.

 

 

Cheers, Lars

Process Intelligence - a new perspective for BI: free webinar on Sep 24th, 2013


I'm pleased to announce that CubeServ will present a new webinar - in German - on Sep 24th.

 

The topic will be "Process Intelligence - a new perspective for BI".

 

Webinar details

 

On the one hand, Process Intelligence means the integration of BI into your business processes.

On the other, it comprises the measurement and analysis of business processes, with the aim of optimizing them continuously and systematically.

 

With today's technology - with or even without HANA - business processes can be combined with BI much better than in the past.

 

In the webinar I will demonstrate how you can achieve quick wins in the context of process optimization projects or process excellence initiatives.

 

When using Process Intelligence approaches, you can reach a new level of BI!

 

Agenda:

  • Process Intelligence: an overview
  • Process Observer:
    • Monitoring of processes in the SAP Business Suite
    • Alerting functionality
  • Measurement and analytics of business processes:
    • Operational Cockpits: modern HTML5 UIs for all process participants in order to ease their daily work
    • Process Performance Dashboards: process-oriented BI
    • CubeServ BI Content "Process Analytics": one data model in SAP BW for all process analytics requirements (e.g. from Process Observer, BPM, Business Workflow, non-SAP)
  • Outlook:
    • SAP Operational Process Intelligence, powered by HANA
    • Process Mining: make your running processes visible!
  • Conclusion and CubeServ BPM offerings

 

Registration

 

Registration is free at http://www.cubeserv.com/event-details/events/webinar-process-intelligence-eine-neue-perspektive-fuer-bi.html .

 

I'm looking forward to your participation.

 

For further questions, you can contact me directly via sebastian.zick@cubeserv.com or via LinkedIn.

High Availability and Disaster Recovery with SAP HANA - Replay, Q&A Available


950x450.jpg

The replay of "High Availability and Disaster Recovery with SAP HANA", part of the Hands on HANA Webcast Series, is now available on demand.

 

Go to this page to watch the replay.

 

What is the minimum bandwidth requirement for HANA 1.0 SPS06? How do we get some estimation based on the existing redo log size?

For SAP HANA System Replication the bandwidth depends on the business load on the customer's primary system. As a rule of thumb, measure the redo log volume generated per unit of time on your classical database and double it as a buffer; that amount per time unit is the minimum transfer rate to the other side. On top of that you need further bandwidth for peaks from the delta data transfer that happens every 10 minutes, which also varies with the business load. We therefore don't simply request a 10 GBit dark fiber line between data centers, but keep the cost as effective as possible relative to the actual business load.
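
To make this rule of thumb concrete, here is a small illustrative calculation in Python; the numbers are made up, and your real inputs must come from measuring the log volume on your classical database:

```python
# Illustrative only: rough minimum bandwidth for SAP HANA System Replication,
# following the rule of thumb above (measured log volume, doubled as buffer).
log_gb_per_hour = 20        # made-up redo log volume of the classical DB
buffer_factor = 2           # double the log amount as a buffer

min_gb_per_hour = log_gb_per_hour * buffer_factor
min_mbit_per_sec = min_gb_per_hour * 8 * 1024 / 3600

print(f"minimum sustained transfer rate: ~{min_mbit_per_sec:.0f} Mbit/s")
# Remember to add headroom for the delta data transfer peaks every 10 minutes.
```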

 

Can you spend a few min on the cost associated with HANA for customers? Including HANA License, hardware, and project implementation (consulting) costs.

This is one of those "it depends" answers. It depends on the customer's use cases, project scope, use of rapid-deployment solutions, data sizing, etc.

 

What is SAP licensing cost for HANA?

HANA licensing depends on the type of HANA license (Enterprise, Runtime, etc.) and the number of blocks purchased.

 

There's a delta cost for HANA compared to, e.g., a non-HANA SAP BW implementation. Knowing the associated costs will be critical when having initial discussions with customers about HANA as an option.

Yes, that is true. This is why we highlighted on one of the first slides to engage all stakeholders, early and often.

 

What is the recommended network bandwidth speed between HANA appliance hardware and SAP Applications server (R3)?

The same sizing guidance applies as for the bandwidth question above: the required bandwidth depends on the business load, with the doubled redo log volume per time unit as the minimum transfer rate, plus headroom for the 10-minute delta transfers.

 

Is this feature supported in SoH (Suite on HANA)?

Scale-out for SoH is currently in ramp-up with SAP, but a single worker node can be combined with a standby node to provide this high-availability feature for a single-node SAP HANA configuration. This can be extended to DR.

 

What happens to the data residing in memory when the node fails?

The in-memory data is rebuilt during the restart on the standby node, which takes over the identity of the failed host. The persistence layer provides all the elements necessary to rebuild all committed data.

 

How long does the failover take?

There are two stages: first, until SAP HANA recognizes a failed node (usually three trials with a 30-second timeout in between, i.e. about 1.5 minutes until failover is initiated); second, the reload of the data. If a slave node fails, only the columns required to respond to outstanding queries have to be loaded, which is a matter of seconds. If a master node fails (in a BW workload), reloading the complete row store can take a few minutes. Overall it is a process of minutes, fully automatic.
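
As a back-of-the-envelope illustration of those two stages (the detection parameters follow the description above; the reload figures are placeholders, not measurements):

```python
# Back-of-the-envelope model of the two failover stages described above.
# Detection: 3 trials with a 30 s timeout in between (about 1.5 minutes).
# Reload: seconds for a slave node, minutes for a master node (row store).
def estimated_failover_seconds(failed_node: str) -> int:
    detection = 3 * 30                                 # trials * timeout
    reload = 10 if failed_node == "slave" else 5 * 60  # placeholder values
    return detection + reload

for node in ("slave", "master"):
    print(f"{node}: ~{estimated_failover_seconds(node) / 60:.1f} minutes")
```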

 

I agree that HA is automated, but DR is NOT. Am I right?

Different choices: yes, HA is fully automatic. DR can be set up as a stretched HA scenario (one worker on the primary side, one standby on the DR side), which is also fully automatic. Synchronous replication for DR will have to be tailored to your needs, but it can be automated.

 

What does Quorum node do?

It avoids split-brain situations. Imagine the network between the two nodes is broken: will both sides fire responses and take action? No. Only the node still connected to the quorum has a majority and will stay functional; the other node will stop working.
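
The underlying rule is a simple majority check. A minimal sketch, assuming the three-voter setup described here (two data nodes plus one quorum node):

```python
# Minimal sketch of the split-brain rule: a node may stay active only if it
# still sees a strict majority of all voters (2 data nodes + 1 quorum node).
def may_stay_active(reachable_voters: int, total_voters: int = 3) -> bool:
    return reachable_voters > total_voters // 2

# Link between the two data nodes is broken, node A still sees the quorum:
print(may_stay_active(2))   # True  -> node A keeps working
print(may_stay_active(1))   # False -> the isolated node stops working
```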

 

What software is running on the Quorum node on slide 11? Is it also running HANA? Are the hardware requirements similar to the 2 main nodes?

No, it just provides a quorum for the file system to make sure only one version is active. During normal operations, there is a local version and a replicated copy. If one side fails (only the storage or the whole node), only one version stays available and the overall system remains available. By default, only the operating system and the GPFS file system run there, but you can add other functionality to that node as well: backup, monitoring, etc.

 

Is it mandatory for the secondary node's memory to equal the primary's? Can I replicate from a multi-node to a single-node system?

No: you need the same capacity and worker node topology on both sides, so replicating from a multi-node system to a single-node system is not possible.

 

What is performance on failover when it is scripted? Do you have experience with this?

With SAP HANA System Replication, takeover times are in the range of 2 to 5 minutes if the secondary is preloaded with data and 10-20 minutes if not. With SAP HANA Storage Replication we see similar times of 10-20 minutes, because some data on the secondary has to be loaded completely from the persistent disks and the log has to be rolled forward.

 

Can SAP HANA keep a copy of the PRD image/snapshot in the DR site before we perform a DR drill, so that we can reload the old image to perform the synchronization once the drill is over? This would help us shorten the re-sync time and minimize bandwidth.

As long as you do not restart the DR side, the PRD replica will stay there. As soon as you connect your app servers to that side, start HANA there, and start working on it, the data will be in production and subject to change. You can certainly run a snapshot function there to maintain a specific set of data.

 

Is the node size an indicator of the primary data segment size? Or does it include HDD & Log replica?

The node size is the amount of physical memory provided by that server. You would need to run a sizing of your application with SAP to determine how much row store and column store would be required and for what workload (BI, BW, BS).

 

This cluster has four nodes and one standby; does that mean for every four nodes we have a standby? Is there a best practice for bigger clusters?

No, you could go with 15 workers and one standby, or 55 workers and one standby, as validated with SAP today. It might be beneficial to define two standby nodes for planned rolling upgrades.

 

What is the solution from IBM for HA with Low network speed?

HA within a data center is provided with internal 10GbE switches, so local high bandwidth is guaranteed. For the DR scenario with connections to a remote data center, it depends on the overall capacity (mind the initial load) and workload (change frequency, etc.).

 

Concerning warm standby: will the IP address from the primary HANA be transferred to the secondary HANA automatically in case of failover?

That can be made part of the overall automation process. Out of the box it is a manual process.

 

Can you please provide more details on how database kernel transfer the data? (log replication?)

The log is written to the remote site in parallel with the local site; here we work similarly to other shadow-database solutions on the market. As a difference to current shadow solutions, SAP HANA still needs a delta data transfer on top of the log transfer as a current compromise. We hope to get rid of this delta mid to end of next year (2014). The delta data amount is derived from the internal shadow memory management we use to create internal savepoints (similar to filer snapshots), which gives us a delta process on database page level.

 

In a Synchronous replication scenario, what happens when transactions are written to primary node when the DR disk storage system is down? Will the transactions wait?

In case of GPFS replication, yes, the transactions will wait and will resume upon reestablishing the connection.

 

Is Distributed Datacenter Scale-out Async solution supported at HANA application level (SPS06) or at GPFS level?

Currently it is supported with system replication (as of SPS06). For GPFS we are working to get an asynchronous approach validated with SAP.

 

How do DR and synchronization work between the data centers when Data Center 2 is not identical or similar to DC1 in terms of available nodes and partitioned data? Or is it mandatory that both DCs be identical or similar in terms of hardware, nodes, etc.?

The capacity and worker node topology have to be maintained on both sides; i.e., you could have 10 worker and 2 standby nodes on the primary side and only 10 worker nodes on the DR side. Reducing the number of worker nodes on the DR side is not supported today.

 

How to size the Bandwidth between the Data Centers using Synch and Asynch modes?

This depends highly on your RPO requirements: the longer the outage and the more data loss you can accept, the lower the bandwidth can be. A higher bandwidth certainly reduces the RPO in an asynchronous scenario.
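
As a rough illustration of that trade-off: in an asynchronous setup, the data not yet shipped to the DR side is what you stand to lose, so the achievable RPO is roughly the replication backlog divided by the available bandwidth. The numbers below are made up:

```python
# Illustrative only: in asynchronous replication, the unshipped backlog is
# what can be lost. A rough RPO estimate is backlog / bandwidth.
def rough_rpo_seconds(backlog_gb: float, bandwidth_mbit_s: float) -> float:
    backlog_mbit = backlog_gb * 8 * 1024
    return backlog_mbit / bandwidth_mbit_s

print(rough_rpo_seconds(backlog_gb=1.0, bandwidth_mbit_s=100))    # ~82 s
print(rough_rpo_seconds(backlog_gb=1.0, bandwidth_mbit_s=1000))   # ~8 s
```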

 

Is there any backup concept like incremental or full for SAP HANA?

There are two fully certified solutions for HANA backup using BACKINT: IBM Tivoli Storage Manager and Symantec NetBackup.

 

Is the SAP Internal System running on IBM?

Sure! Please watch the SAPPHIRE keynotes: HANA Enterprise Cloud would not be there without IBM, CRM is running on 6TB nodes, and we're also working on SAP IT's ERP system (to repeat what Vishal Sikka, SAP CTO, mentioned there).

 

When are scale-out systems for Suite on HANA planned for release?

This is currently in ramp-up. Please contact SAP to get enrolled.

 

Is there a recommendation about distance/latency recommended to implement Disaster Recovery between Site 1 and Site 2 for Synchronous and Asynchronous replication?

There are upfront measures: if the one-way network latency is below 350 µs, synchronous replication will work well. Ultimately, a benchmark will have to be run to determine whether network optimization is required from a bandwidth/latency perspective.

 

Is DB consistency maintained in case of both synchronous & asynchronous scale-out multi-node scenario?

In the strict sense of the word, the RPO with a synchronous approach equals zero, so yes, the database stays consistent all the time. An asynchronous approach might lose some packets in flight; depending on the bandwidth and latency, you can reduce the RPO to a minimum.

 

Is there a recommended number of standby servers depending on your cluster size / T-shirt size?

Up to 56 nodes have been validated with SAP. One standby is the minimum; two standby nodes are definitely recommended to also cover planned outages (e.g. rolling upgrades), and more standby nodes can be configured to provide HA against multiple node failures. Please keep in mind that a failover has to finish before protection against another node failure is restored.

 

If you have a very big replication environment, I believe the infrastructure side is fine. But if you have a logical corruption, you will have a lot of work to do (restore, replicate again, etc.), am I right?

Well, from an IBM perspective this is described in the Operations Guide. All kinds of scenarios are taken care of there, whether you lose infrastructure components or just data. Please get in touch if you'd like to elaborate further: rettig@de.ibm.com.

 

We regularly face issues with HANA slowdowns and HANA database crashes. Please address those issues; our development work gets hampered by this.

Are there OSS tickets that you can refer to? Certainly this is annoying and should not happen. I'd like to understand the specifics of why that happens and would certainly work with SAP and my team to provide assistance: rettig@de.ibm.com.

 

Can I use System Replication for HA and DR? Replicate to a 3rd system?

Yes - with the IBM solution.

 

What SAP HANA training courses and links should we access to actively get involved with HANA?

We have several resources on sappartneredge.com and saphana.com.

Meet ASUG TechEd Speaker Tomas Krojzl, a SAP HANA Distinguished Engineer


1asugteched.jpg

 

I am pleased to introduce ASUG TechEd speaker Tomas Krojzl, of the SAP Mentors program and the SAP HANA Distinguished Engineer Program.

 

1fi.jpg

His session is ITM225 SAP HANA – IBM GPFS: Architecture, Concepts, and Best Practices

 

Q: Please tell us about yourself

 

 

A: Currently I am an SAP HANA Specialist at IBM Czech Republic. I started in the IT industry in 2000 as a freelancer and, after switching several companies, finally joined IBM in 2005, where I have worked ever since. At the beginning I did not have any specialization, but once I saw SAP NetWeaver I instantly fell in love with the technology and became an SAP NetWeaver Basis SME, until 2011, when SAP HANA was released to the market. I got the unique opportunity to get my hands on this amazing piece of technology just a few weeks after it was out and found my new hobby.

 

2fig.jpg

My focus within the SAP HANA world changes over time and with requirements. I tend to deal with various subjects; however, I consider my "home" area to be SAP HANA architecture, implementation, and operation. Since I work exclusively with IBM hardware, I specialize in solutions for SAP HANA delivered by IBM.

 

 

I am also honored to be an SAP Mentor and an SAP HANA Distinguished Engineer, which are among the biggest recognitions I have received over my career.

 

 

Q: What are some of your hobbies?

 

 

A: SAP HANA technology is in high demand, so I do not have a lot of spare time for hobbies. When not busy with SAP HANA, I try to spend as much time as I can with my wife and two kids. Sometimes I find time to read a book or watch a movie.

 

3fig.jpg

Q: Tell us about your SAP TechEd experience and session

 

 

A: SAP TechEd is a fantastic event and I am happy that SAP is organizing these knowledge-sharing events. For the first time I will be presenting my own session at such a huge conference (big thanks to ASUG for giving me the chance to do this). The focus of the session will of course be SAP HANA, but from an angle that is not very common. Most sessions focus on the database part of the SAP HANA appliance; however, there are other layers below that are equally important. This subject is too wide to be completely covered in a single session, so I decided to focus on the IBM GPFS file system, which is at the core of the IBM solution for SAP HANA. The name of my session describes its content: "SAP HANA - IBM GPFS: Architecture, Concepts, and Best Practices".

 

 

I would like to invite everyone interested in learning about SAP HANA from a different perspective, especially those who are running or will be running the solution from IBM. You will learn how this solution is designed, how it works, how to properly operate it, and how to easily deal with potential problems you might encounter.

 

 

I will be running around SAP TechEd in an SAP Mentor T-shirt, proudly wearing number 122. If you see me and would like to discuss SAP HANA (or any subject where I can help), please do not hesitate to stop me; I will be more than happy to engage in a discussion. See you at SAP TechEd 2013 Las Vegas...

 

 

Editor's note:

Tomas did not share this picture, but I found it on Twitter. He was one of the few selected and honored by IBM CEO Ginni Rometty as "Best of IBM 2013" - a tremendous honor:

lastGinny.png

 

All photos provided by Tomas.

 

So add SAP HANA – IBM GPFS: Architecture, Concepts, and Best Practices to your SAP TechEd Las Vegas agenda today.  Tomas is also a big Star Trek fan, as I found out last May; we can quote the characters.  I thank Tomas for submitting his abstract and sharing his story.

 

Related Links:

ASUG TechEd Meet the Speaker Tamas Szirtes - Using a Mobile Portal to Monitor SAP HANA

Meet ASUG TechEd Speaker Emil - Kellogg's Data How to Deliver One Version of the Truth

Beer and Analytics by Alexandre Papagiannidis Rivet

Workflow Approval Anywhere, Anytime - Meet the Speaker Graham Robinson - ASUG SAP TechEd session

Meet the ASUG TechEd Speaker Dennis Scoville - Empowered Self-Service BI with SAP HANA and SAP Lumira

ASUG TechEd Pre-Conference Sessions

ASUG TechEd Meet the Speaker Doni Kessel & Learn BW to HANA Migration

SAP ERP to HANA Migration - Meet ASUG TechEd Speaker Moses Nicholson

Meet ASUG SAP TechEd Speaker Sandy Speizer - Keeping BW in Tip-Top Shape for SAP HANA

VIDEO: Seamlessly Migrate SAP Business Suite to SAP HANA


SAP HANA makes real-time transactions and analytics a healthy reality. In this video, we explain how hosting with Savvis, a CenturyLink company, enables flexibility and provides services that make it simple to migrate SAP Business Suite to SAP HANA while preserving your SAP investments, supercharging your business, and simplifying your IT landscape.

 

 

We're happy to chat about your unique situation. If you have any questions about SAP HANA running on Savvis infrastructure, please email us at sap@savvis.com.

Searching for certified HANA Enterprise Cloud partners


I’ve been doing further research on the HANA Enterprise Cloud - in particular, the broader ecosystem surrounding the new offering - and I keep stumbling across references to certified HANA Enterprise Cloud partners.

 

image001.jpg

[SOURCE]

 

I imagined myself as a customer - excited and intrigued by this new solution - looking for such partners. I decided to document my attempt, which was largely unsuccessful, and provide suggestions on how to improve such searches.

 

Note: This blog doesn't delve into what a "HANA Enterprise Cloud certification" actually means, because I failed to find a definition of one. Expertise with the underlying HANA technology? Support for certain SLAs? Experience with a certain consulting methodology? I'd expect something similar to what is available for SAP-certified Cloud Services Providers. Here is a short description of that certification:

 

 

To ensure ongoing, high standards for management of SAP solutions, certified providers undergo an extensive assessment by SAP to validate the operational integration of their cloud services with supported SAP applications. The audit also includes technical reviews of physical and logical security processes supporting those cloud services. This entire process undergoes a recertification process every two years. Currently, the certification is focused on providers offering private cloud services for managing SAP solutions. [SOURCE]

 

 

This information might be available to potential partners on the protected PartnerEdge site, but as a customer it is important to understand what a certification includes and what it does not.

 

The Search

 

When the HANA Enterprise Cloud was announced in May, there was an associated announcement about partners: 

 

SAP today announced the SAP HANA Enterprise Cloud partner program, which will aim to provide participants access to enablement, best practices and other resources to leverage the recently announced SAP HANA Enterprise Cloud service [SOURCE]

 

There was a list of companies that were initial pilot partners and that agreed to participate in the SAP HANA Enterprise Cloud partner program - each company provided a supportive quote.

 

This initial list was created for the HEC announcement in May. Since its creation, there has been very little publicly available information about this partner program.

 

On my search for certified HEC partners, I decided that this list would be a good starting point. My first problem was the question: these were participants in the partner program, but were they certified? The relationship between this partner program and certification wasn't described anywhere.

 

I tracked down further HEC-related information about some of these partners.

 

  • Itelligence: This is the only press release from a partner that specifically mentions SAP-Certified Provider of Hosting Services for SAP HANA® Enterprise Cloud.
  • Savvis: This press release, which describes Savvis' subscription-based services for SAP HANA, sure sounds like HEC, but it predates the HEC announcement.
  • Virtustream (General cloud, HANA Enterprise Cloud): These two press releases show the distinction between broader cloud hosting and HEC-associated hosting.

 

Note: These partners are probably certified, but I was unable to find any publicly available material supporting this assertion.

 

Note: There might be other partners that are HEC-certified service providers, but I failed to find them in my searches.


A suggestion to solve this problem

 

The HEC is critical to SAP's cloud and HANA GTM strategy and will be successful only if the partner ecosystem energetically embraces it. If customers are unable to find certified partners, a critical prerequisite for mass adoption is missing.

 

A perfect opportunity exists to close this gap via the existing Certified Outsourcing Partners list:

 

image002.jpg

 

In the current implementation of this list, partners can be identified as being certified in various areas:

image003.jpgimage004.jpg

This site must be adapted so that HEC-certified partners can be easily found. This would mean adding "HANA Enterprise Cloud" to the search criteria. Furthermore, certified partners would be identified as such:

image001.jpg

This change should be easy to implement and would allow customers to find such partners quickly.

 

SAP should have an interest in moving in this direction, but I would expect those partners that are already HEC-certified to be clamoring for such changes. As more and more companies move towards HEC certification, those early adopters still have a window of opportunity to gain and solidify market share before the market becomes more crowded.

HANA + Hadoop in the Cloud: The role of the HANA Enterprise Cloud in SAP’s Big Data strategy


There has been a noticeable increase in interest in Big Data at SAP (Big Data bus, etc) in recent months.

 

What is still unclear, however, is the relationship between SAP’s Cloud and its Big Data strategies.

 

In a recent Big Data-related blog, Vijay Vijayasankar makes a passing reference to how these two strategies relate to one another:

And we will make it easy to use - easy to administer, easy to consume, easy to extend and so on. You choose the deployment model that is right for you - keep it inhouse, or move it to Hana Enterprise Cloud. [SOURCE]

 

I asked Vijay on Twitter for confirmation that the HANA Enterprise Cloud is one deployment option for such BigData solutions and he responded:

very specifically, platform for sap big data solutions - that explicitly have Hana

 

With this as a starting point, I started looking for other background material. Swen Conrad - SAP HANA Marketing – also references this scenario:

 

Running mission critical applications such as SAP Business Suite, SAP Business Warehouse and several big data applications delivered as a managed cloud service. These services help customers assess, migrate and run rich applications with cloud simplicity [SOURCE]

 

In theory, it appeared that there was some relationship but I needed to look at some concrete scenarios.

 

With this in mind, I examined some of the existing SAP Big Data solutions – one of which is Demand Signal Management (DSiM):

SAP Demand Signal Management, which is powered by SAP HANA, contains a consistent, centralized In-Memory database that stores large volumes of data such as internal master and transactional data (e.g. shipments), external POS data and market research data, etc. This is combined with a framework that ensures the integration, cleansing and harmonization of the data during upload in order to achieve high qualitative results and a common data base for reporting and further processing of the data. [SOURCE]

 

Trying to stump Vijay, I asked about the viability of this product on HEC. His response:

 

absolutely viable for HEC - it sits on top of BWoH which HEC supports

 

Satisfied with this answer, I moved on to other Big Data topics.

 

HANA + Hadoop

 

These activities included a blog about a series of reseller agreements from SAP regarding Hadoop. Part of that press release concerned a set of new Big Data applications from SAP.

 

SAP Demand Signal Management is the first in a series of big data-enabled applications SAP intends to release before the end of 2013, including the SAP Fraud Management analytic application as well as the SAP Customer Engagement Intelligence solution, which includes the SAP Audience Discovery and Targeting, SAP Customer Value Intelligence, SAP Social Contact Intelligence and SAP Account Intelligence analytic applications. By deploying big data-enabled applications, enterprises can get to repeatable, measurable results much faster by infusing insights directly into day-to-day operations

 

What was interesting about these new applications, however, wasn’t mentioned in the official press release:

SAP has started rolling out shrink-wrapped applications designed to run on the combination of Hadoop and HANA. The first application, called SAP Demand Signal Management, is designed to help manufactures capture and analyze large volumes of "downstream" demand signals, including retail point of sale (POS) data, consumer sentiment data, and market research data.

 

SAP has plans to deliver two additional shrink-wrapped Hadoop-HANA apps before the end of 2013, including the SAP Fraud Management analytic application and SAP Customer Engagement Intelligence solution. [SOURCE]

These were applications from SAP that would be based on both HANA and Hadoop. This intention made the reseller agreements much more understandable.

 

HANA + Hadoop in the HANA Enterprise Cloud

 

I remembered Vijay’s previous tweet about DSiM and I started wondering about the use of Hadoop in such HEC-based Big Data applications.

 

Swen Conrad had referred to this possibility in another context.

Another use case example, according to Conrad, will be a hybrid (on-premise/HANA cloud) environment for real-time analytics and big data projects.

 

“It will probably be too costly to perform big data projects entirely in the cloud with SAP HANA, as our pricing will be based on the amount of data you have in-memory. But, we can integrate HANA from the cloud with Hadoop, so companies can combine the two in a hybrid architecture,” he told IDN. In this approach, a company would collect and filter its data using Hadoop and once it has identified the meaningful datasets, put those into HANA. “In that way, HANA can provide you with real-time data results, rather than waiting for hours and hours, at a very reasonable cost.” [SOURCE]

From my understanding, Swen’s portrayal was that Hadoop would exist somewhere else (OnPremise?) rather than in HEC.

 

I recalled another case of an SAP customer - the Globe and Mail newspaper - using Hadoop in a cloud-based scenario based on HANA One on AWS.

 

I was curious as to whether this solution - although based on another cloud provider - would work on HEC, and started bugging Vijay again.

image001.jpg

 

Boom - Hadoop + HANA running on the HANA Enterprise Cloud.

 

I let that news settle for a few minutes and started to think about the repercussions of this design pattern.   I made a quick drawing to depict the potential impact of this functionality.

image002.jpg

My POV:

 

  • Although Vijay made his Hadoop-related comment referring to the possibility of the Globe and Mail solution running on HEC, it is possible to imagine one of the newly announced HANA+Hadoop Big Data applications, such as DSiM, running on HEC.
  • As the example of the Globe and Mail demonstrates, cloud-based HANA + Hadoop Big Data applications are technically possible. The presence of such applications, however, in the HEC is a different story inasmuch as they must be understood in the context of the HEC as a Managed Service with its associated characteristics (SLAs, maintenance / support, etc).
  • Although Hadoop + HANA Big Data solutions on HEC might work with customers still using an OnPremise BusinessSuite on HANA (it might also work for OnPremise Business Suites not running on HANA), they would be a perfect fit for those customers with a HEC-based BusinessSuite on HANA.
  • The usual manner for HANA apps to access Hadoop is via Smart Data Access (SDA), so there is no direct access to Hadoop from the Big Data applications (see the sketch after this list).
  • HANA is a prerequisite for an application to run on the HANA Enterprise Cloud. Thus, Big Data applications which are only based on Hadoop probably will not be hosted in the HEC.
  • There are a variety of other HEC certified partners. The ability of such partners to provide HANA + Hadoop Big Data applications may be limited in that most partners won’t have reseller agreements with various Hadoop distributions. This distinction may provide the SAP-hosted HEC with certain competitive advantages in the Big Data marketplace.
  • Another scenario for these "H2" Big Data applications on the HEC would be one where Hadoop is hosted external to the HEC. This might lead to performance issues inasmuch as data - perhaps large amounts of data - would have to be transferred between the two hosting environments.
  • My assumption is that initially most of these Big Data applications hosted in SAP’s HEC will originate from SAP. I could imagine some sort of certification program for non-SAP Big Data solutions – from customers or partners – would be required before SAP takes responsibility for them as part of the Managed Service. In their own certified HEC environments, partners might wish to host their own Big Data apps.
  • Which Hadoop distribution will be used for these applications? SAP has reseller agreements with Intel and HortonWorks. I assume that there will be a single Hadoop distribution for all Hadoop-based Big Data applications in HEC inasmuch as this would greatly simplify administration and support of those applications.
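
For illustration, here is roughly what the SDA wiring mentioned in the list above looks like from the HANA side. This is a hedged sketch, not a definitive recipe: it assumes the SAP HANA Python client (hdbcli) and an ODBC DSN named 'hive' pointing at the Hadoop cluster, and adapter names and options vary by HANA revision, so check the official SDA documentation for the exact syntax.

```python
# Hedged sketch of Smart Data Access (SDA) wiring, not a definitive recipe.
# Assumes the SAP HANA Python client (hdbcli) and an ODBC DSN "hive" for the
# Hadoop/Hive cluster; adapter names and options vary by HANA revision.
from hdbcli import dbapi

conn = dbapi.connect(address="hanahost", port=30015,
                     user="SYSTEM", password="secret")
cur = conn.cursor()

# Register the Hadoop cluster as a remote source via the Hive ODBC adapter.
cur.execute("""
    CREATE REMOTE SOURCE "MY_HADOOP" ADAPTER "hiveodbc"
    CONFIGURATION 'DSN=hive'
    WITH CREDENTIAL TYPE 'PASSWORD' USING 'user=hive;password=secret'
""")

# Expose a Hive table as a virtual table; HANA applications then query it
# like any local table, which is why they need no direct Hadoop access.
cur.execute("""
    CREATE VIRTUAL TABLE "MYSCHEMA"."V_POS_DATA"
    AT "MY_HADOOP"."hive"."default"."pos_data"
""")
cur.execute('SELECT COUNT(*) FROM "MYSCHEMA"."V_POS_DATA"')
print(cur.fetchone())
```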

A big change is coming in data management


As data volumes grow relentlessly, business leaders are acutely aware that relying on old, inaccurate, or inaccessible data is a costly recipe for failure. Instead, they want to seize the advantages that next-generation data management can bring, such as faster response to drive improved customer service, greater efficiency to grow margins, and better decision making to deliver competitive edge. They also know the rewards are great for companies that can rapidly collect meaningful data and rapidly transform it into actionable insight.

 

Businesses need data management platforms that can make sense of an expanding collection of disparate systems – from corporate infrastructure to social media networks. It’s encouraging to learn that, according to an IDG Research Services survey,* their IT departments are open to the big changes that may be required to achieve this.

  

When IDG asked senior and mid-level IT professionals about their data management objectives, two stood out: managing costs and increasing access to real-time data. Neither should come as a surprise. Naturally, the bottom line continues to be a priority. But employees, customers, suppliers, and partners make no allowances for tight budgets. They want information, and they want it now. Other objectives included providing support for a remote or mobile workforce, enabling better customer responsiveness, working more closely with partners, and responding faster to change.

  

So, what are the biggest challenges facing IT professionals as they try to achieve these objectives? Again, it's no surprise that cost is top of the list. However, it is followed by a range of disparate concerns, including data volumes and quality, integration of silos, inadequate staffing, and a range of technical issues, such as scalability, data redundancy, and slow querying and reporting speed.

  

Indeed, it is this latter area, along with total cost of ownership (again), that leads to the most dissatisfaction with current data infrastructures. No wonder nearly half of respondents plan to evaluate new solutions in the next 12 to 24 months.

  

In fact, 27% are willing to move to an entirely new data-management infrastructure. The rise of Big Data applications, the growing popularity of the Apache Hadoop distributed application solution, the growth of unstructured data, and the need for real-time analysis are some of the catalysts. And many companies are weighed down by legacy database infrastructures that just can’t bring them together.

 

Of course, there are barriers to change, but they are not overwhelming. Data management is now a critical strategic issue that outweighs short-term cost savings.

  

So where will they turn? Well, for organizations exploring new data management solutions, there are three main categories, each with its pros and cons:

  • Turnkey hardware-and-software bundles
  • Custom-built solutions
  • Integrated software platforms

 

In fact, the integrated software option is by far the preferred approach of survey respondents. It aligns well with the product portfolio of SAP, which includes the SAP HANA® platform – an in-memory database optimized for near-instantaneous access to and analysis of real-time data.

  

Of course, no big IT change comes without risk – and this is certainly the case when it comes to altering – or even replacing – essential data management systems. But, increasingly, the most dangerous strategy is to do nothing – or to push a legacy system beyond its comfort zone. It is good to see that both IT and business professionals are increasingly willing to make the change.

 

You can learn more at www.sap.com/realtime_data.

  

* IDG Research Services survey of 100 senior and 100 mid-level IT managers from a variety of sectors and a range of company sizes (June 2012).

SAP Operational Process Intelligence @ SAP TechEd 2013


The clock is ticking...

 

28 days later...

Huge turmoil. Crowds of people maneuvering through wide hallways, craving food, having avoided daylight for days...

 

No, I am not describing a scene from the horror movie of the same name. This is what SAP TechEd 2013 in Las Vegas might look like.

In only 28 days, SAP TechEd 2013 will kick off in Las Vegas. We are already looking forward to this event, because we have exciting news to share and the latest and greatest product innovations for you to get your hands on.

 

If Harshavardhan Jegadeesan's blog post announcing What's new in SAP Operational Process Intelligence SP01 has stirred your interest, I would like to point you to some sessions at this year's SAP TechEd.

 

Lecture and Hands-On Session Overview

 

POP101: Intelligent Processes on SAP HANA – Overview and Outlook

Abstract:

With SAP NetWeaver Business Process Management (SAP NetWeaver BPM), SAP has offered a tool that enables customers to build their own business-user-friendly composite processes, extending the SAP Business Suite, or innovating in areas that have been untapped opportunities for process automation. With SAP NetWeaver BPM's availability on SAP HANA, customers can now leverage synergies to build "intelligent processes", leveraging the power of SAP NetWeaver Process Orchestration and SAP Operational Process Intelligence, to define performance-driven applications with built-in process visibility and integration. This session will provide an introduction to the solution and an explanation of the road ahead.

Schedules:

  • Las Vegas: http://sessioncatalog.sapevents.com/go/agendabuilder.sessions/?l=58&sid=8148_34003
  • Amsterdam: http://sessioncatalog.sapevents.com/go/agendabuilder.sessions/?l=57&sid=8148_34169
  • Bangalore: http://sessioncatalog.sapevents.com/go/ab.sessioncatalog/?l=59&sid=8148_34607

 

POP102: SAP Operational Process Intelligence Powered by SAP HANA

Abstract:

SAP Operational Process Intelligence is a new SAP HANA-based technology solution that enables process participants and line-of-business managers to drive the execution of their operational business processes through process visibility and performance management, by defining their goals, milestones, and KPIs and measuring process success along easy-to-understand process phases, simulations, and predictions. SAP Operational Process Intelligence can leverage a variety of operational data providers from SAP and non-SAP to rapidly build your real-time process intelligence solution. This session will provide an overview and demonstrate the capabilities with a live demo.

Schedules:

  • Las Vegas: http://sessioncatalog.sapevents.com/go/agendabuilder.sessions/?l=58&sid=8155_34048
  • Amsterdam: http://sessioncatalog.sapevents.com/go/agendabuilder.sessions/?l=57&sid=8155_34182
  • Bangalore: http://sessioncatalog.sapevents.com/go/ab.sessioncatalog/?l=59&sid=8155_34613

 

POP161: Building a Business Scenario in SAP Operational Process Intelligence

Abstract:

SAP Operational Process Intelligence powered by SAP HANA is a real-time decision-support solution that helps line-of-business users gain visibility into their end-to-end processes (BIG processes). In this session, you will play the role of a solution expert and build a business scenario by assembling an end-to-end process, defining high-level phases and milestones, and defining key process indicators. You will also learn best practices in uncovering visibility needs and specifying a business scenario. In addition, you will learn how to enable SAP Business Suite systems to provision process-related data using Process Observer.

Schedules:

  • Las Vegas: http://sessioncatalog.sapevents.com/go/agendabuilder.sessions/?l=58&sid=7602_34019
  • Amsterdam: http://sessioncatalog.sapevents.com/go/agendabuilder.sessions/?l=57&sid=7602_34191
  • Bangalore: http://sessioncatalog.sapevents.com/go/ab.sessioncatalog/?l=59&sid=7602_34621

 

POP853: Road Map Q&A: SAP Business Process Management, Rules and SAP OPI

This road map session will provide an overview of the SAP NetWeaver Business Process Management, Rules, and Operational Process Intelligence portfolio. It will show where we are and where we are heading in the areas of process orchestration, mobilizing business processes, operational process intelligence, and SAP HANA.

Schedules:

  • Las Vegas: http://sessioncatalog.sapevents.com/go/agendabuilder.sessions/?l=58&sid=9415_34830
  • Amsterdam: http://sessioncatalog.sapevents.com/go/agendabuilder.sessions/?l=57&sid=9415_35161
  • Bangalore: http://sessioncatalog.sapevents.com/go/ab.sessioncatalog/?l=59&sid=9415_34940

 

Expert Networking Session

Use-case and demo of Operational Process Intelligence powered by SAP HANA

Our partner Owen Pettiford will run two expert networking sessions in both Las Vegas and Amsterdam. In his talks he will discuss the use cases for SAP Operational Process Intelligence powered by HANA. This tool allows business analysts to build observation models on top of SAP Business Suite, SAP Business Workflow, SAP BPM, and 3rd-party systems.

Schedules:

  • Las Vegas:
    • EXP10677: http://sessioncatalog.sapevents.com/go/agendabuilder.sessions/?l=58&sid=10677
    • EXP10678: http://sessioncatalog.sapevents.com/go/agendabuilder.sessions/?l=58&sid=10678
  • Amsterdam:
    • EXP10679: http://sessioncatalog.sapevents.com/go/agendabuilder.sessions/?l=57&sid=10679
    • EXP10680: http://sessioncatalog.sapevents.com/go/agendabuilder.sessions/?l=57&sid=10680


 

Track and Subtrack Structure

As rough guidance: all our sessions will be hosted in the Process Orchestration, Integration, and Portal (POP) track, more precisely in the Intelligent Processes and Rules (IPR) subtrack. There are more sessions in the IPR subtrack around SAP NetWeaver BPM and SAP NetWeaver BRM.

Join SAP at ISC Big Data Conference in Heidelberg


 

pic1.png

If you are planning to attend the inaugural ISC Big Data '13 Conference, be sure to meet SAP colleagues to learn how SAP can:

  • Enable real-time business insight with SAP HANA®
  • Ignite new business models and revenue streams
  • Uncover critical business needs and opportunities
  • Share how companies are using Big Data today

Join Stefan Sigg in the session titled "SAP HANA - Enabling New Business with Big Data Applications" on September 26, 1:30 pm - 3:30 pm.

 

stefan.png

Stefan is Senior Vice President for SAP HANA Product and Development. His teams cover all of SAP HANA and Sybase database product management, as well as the development of SAP Business Warehouse, SAP Enterprise Performance Management, SAP Enterprise Search, and further application- or content-oriented capabilities. He is responsible for the overall product strategy and product/customer relationships, shaping the future of this innovative technology within and outside the realm of SAP solutions. In this role, a major focus is on providing a link between highly relevant business scenarios and the innovation power of the SAP HANA technology platform.

In his session Stefan will present SAP HANA, a main-memory-based data management platform, and the major categories for using SAP HANA. He will also share customer Big Data scenarios, such as providing predictive maintenance services.

 

 

 

 

Meet SAP Big Data Experts at the SAP booth

 

 

Ingo Brenckmann

ingo.jpg

 

 

 

 

Ingo Brenckmann is a Senior Development Manager in SAP HANA Product & Strategy at SAP AG. Be sure to connect with Ingo as he explains why speed is essential when it comes to Big Data and how it impacts business. In particular, he can share the approach SAP has taken with in-memory computing and how customers are using Big Data today.

Ingo co-authored the book "The SAP HANA Project Guide", in which he shares key findings from SAP HANA projects to help ensure the success of your SAP HANA project and to guide you in identifying suitable scenarios.

 

 

 

 

 

Jan Teichmann

 

 

Jan.png

 

 

 

 

Jan Teichmann works in HANA Product Management on HANA in the Cloud as a Development Architect. He is used to "thinking cloud", having worked on ByD, checking whether service consumption is cloud-like with regard to robustness, deployability, immediate consumability, and low entrance and exit thresholds for customers. Having also worked with TREX and HANA for many years, he knows the strengths of combining OLTP and OLAP reporting, using a programming model that harnesses the computational power of HANA.

He started his career at SAP in 1996 in ERP HCM development, getting to know ERP customers through consulting, training, and custom development.

Michael Byczkowski

michael.png
Michael Byczkowski is Vice President in the Developer Experience organization. Michael's key responsibility is to drive adoption of SAP's latest technologies and foster innovation through developers with customers, partners, and anyone interested. In this role, he focuses particularly on SAP platforms like NetWeaver Gateway and all its predecessors, both from a technology and a business perspective. He also led the NetWeaver Gateway efforts of the global SAP Ecosystem organization, established a partnering strategy, and developed and implemented a Design Partner Council program for SAP NetWeaver Gateway.

I look forward to seeing you at the event.

3rd Edition of SAP HANA Essentials - Free Download


A first attempt at using HANA Cloud Integration to integrate Workday and Salesforce


Earlier this week, I blogged about the release of new documents concerning HANA Cloud Integration (HCI). As I was reading these documents, I thought about the various integrations that might be possible with this new technology. I was feeling a bit naughty and thought: "I wonder if it would be possible to use this technology to integrate Workday and an SAP back-end."

 

Starting to do some research on this idea, I ran across an article about a recent agreement between Workday and Salesforce:

Under a technical agreement and go-to-market strategy, Salesforce and Workday will do the following:

 

  • Salesforce will standardize on Workday's applications.
  • Workday will standardize on Salesforce's applications and platform.
  • Workday will integrate Salesforce with Workday's human capital management software, financials and big data analytics software.
  • Salesforce will integrate Workday into Chatter and other applications. [SOURCE]

Then I had an even more amusing thought: "What about a Workday - Salesforce integration?"

 

Since the Eclipse-based software for HCI is already available, I decided to see how far I could get with my nefarious - no, heretical - plan.

 

Unfortunately, there were a few hurdles in the way of my mischievous plan:

 

  • I would be unable to actually deploy my integration since I didn’t have access to a real tenant in the HCI cloud environment.

In the operation subsystem preferences (Window > Preferences > SAP HANA Cloud Integration > Operations Server) specify the URL provided to you by SAP.

This meant that I was restricted to playing with the design-time environment. The runtime environment / operations views with all the cool monitoring stuff wouldn't be available.

  • I'm not a domain expert in HCM or CRM, or a technical expert in Workday or Salesforce, so my integration probably wouldn't be optimal.

 

Integration

 

Since both Workday and Salesforce use SOAP, I started looking for example WSDL files. It turned out to be more difficult than expected, but I finally discovered what I needed. I got the Salesforce WSDL file from an example Java project and the Workday WSDL file from their public API site.

 

Note: I was unable to find publicly available WSDL files for most SAP Cloud applications - it looks like such files are only available to partners. The one exception is SuccessFactors, where various APIs / WSDL files were found - though I have no idea whether they are still valid.

 

A Caveat for PI Developers / Partners: Prepare to meet an old friend. The Eclipse tools for HCI appear to be very similar to the existing Process Integration Eclipse tools; they could even be the same tools. I'm not a PI expert, so I can't make that judgment call. This similarity shouldn't be a reason for disappointment but rather excitement, in that existing PI developers / partners can exploit their previous experience to rapidly adopt this new cloud-based technology.

 

Steps

 

Note: I'm not going to describe all the preliminary steps (project creation, etc.) - you'll find an excellent description in the HCI developer documentation.

 

1. Import both WSDL files into Eclipse

image001.jpg

 

2. Build a very simple integration process.

 

image002.jpg

 

3. Create a mapping between the two WSDL files

 

image003.jpg

image004.jpg

The WSDL files from each participant were difficult to understand, and my mapping between the two WSDL files was valid but not correct in terms of actual functional associations.
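
To give a feel for what the mapping step does, here is the same idea expressed in plain Python. All field names are hypothetical, and the associations are illustrative rather than functionally correct, just like my Eclipse mapping:

```python
# Illustrative only: the kind of field association a message mapping performs.
# All field names are hypothetical; the real Workday and Salesforce WSDLs
# define far larger structures.
def map_worker_to_contact(worker: dict) -> dict:
    """Map a (hypothetical) Workday worker record to a Salesforce contact."""
    return {
        "FirstName": worker["name"]["first"],
        "LastName": worker["name"]["last"],
        "Email": worker["contact"]["work_email"],
        "Title": worker["position"]["title"],
    }

worker = {
    "name": {"first": "Ada", "last": "Lovelace"},
    "contact": {"work_email": "ada@example.com"},
    "position": {"title": "Analyst"},
}
print(map_worker_to_contact(worker))
```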

 

Caveat: I'm obviously missing tons of functionality, such as authentication, etc.


 

4. Deploy the new integration to my tenant in the cloud

 

image005.jpg

 

Full stop: without a valid tenant I can't deploy anything. My initial attempt would have failed miserably at runtime if I had actually tried to start it, but with more work, accounts on Workday and Salesforce, etc., I'm sure it would eventually be functional.


 

Conclusion

 

When I started this exercise, I knew that I wouldn't be able to use the runtime features. Yet I still decided to try out the new functionality. My primary motivation was a response to a comment on an earlier blog on HCI and Mulesoft:

 

 

While SAP will provide (in time) is own integration platform in the cloud, it will primarily be focused on SAP to SAP, similar to Netweaver PI is today.   MuleSoft is one of the early access partners, focused on providing both cloud and ground based integration for SAP to non-SAP.  Today we provide SAP certified connectors to ECC directly, through JCO, a native plug-in for Netweaver PI, as well as cloud connectors for both SAP Sales OnDemand, and SuccessFactors.  [Source]

 

 

This comment implies that with HCI, SAP would concentrate on SAP-to-SAP integrations - primarily via the predefined content that SAP will provide. Partners - including Mulesoft - would cover SAP-to-non-SAP content, perhaps via other integration platforms. As my example above shows, however, HCI is neutral: as a partner, you can use HCI to deploy a variety of integration patterns (SAP-to-SAP, non-SAP-to-SAP, non-SAP-to-non-SAP). I assume that SAP itself has little interest in providing a Workday-to-Salesforce integration, yet the potential is there, and it demonstrates the exciting promise the platform holds for the broader SAP ecosystem.

 

 

Managing Big Data


Big Data is currently the main disruptive force driving many companies to rethink their data strategies. In their new book, The Human Face of Big Data, Rick Smolan and Jennifer Erwitt explain that soon every object on earth will generate data - our homes, our cars, even our own bodies. By gathering and analyzing data on a real-time basis, this new knowledge may lead us to address some of humanity’s biggest challenges, including pollution, world hunger, and illness. They reveal that we are exposed to more data in a single day than our 15th-century ancestors were in a lifetime. Smart devices are turning us into human sensors, as we leave a continuous digital trail of texts, calls, location data and more.*

  

This may feel overwhelming, even threatening. But as our planet begins to develop its own nervous system – of which we are all a part – it’s worth considering how we can benefit from Big Data.

 

First, we should examine what it is. In essence, Big Data comes in three flavours:

  • Structured data generated from business processes.
  • Machine-generated data, such as clickstream or sensor data.
  • Human-sourced data, such as social media, images, audio and unstructured documents.

  

This presents problems for traditional data warehouses, which cannot keep up with new sources of information, new types of data, more complex analysis and the greater speed required. In fact, despite the explosion in data volume, variety and velocity, most data is never actually collected.

  

Fortunately, we have now developed new ways of capturing and using Big Data with in-memory platforms such as SAP HANA®, which deliver near-instantaneous access to and analysis of real-time data.

 

SAP HANA allows us to develop new processes and eradicate limitations of the past. This means we can not only use Big Data to see and understand information in more detail, at the moment when we need to see it, but we can also use advanced analytics to discover entirely new ways of doing things.

  

To meet the evolving needs of Big Data projects, many businesses are embracing a hybrid data ecosystem - a flexible data management environment that can comprise a changing set of technologies, such as operational platforms, cloud-based platforms, Hadoop, analytical platforms, discovery platforms, data marts and data warehouses.

  

You can learn more at www.sap.com/realtime_data

 

*SAP is a proud sponsor of "The Human Face of Big Data", a fascinating look at the effect that Big Data has on humanity. To learn more about this and other stories in the book: http://amzn.to/UhubvR


Try Out SAP HANA in a Big Data Scenario


Just in time for SAP TechEd, I’m pleased to announce the public beta of the trial cloud for SAP HANA. Our first released scenario is SAP HANA One and Big Data - Processing Wikipedia Data with Hive and Analyzing with SAP HANA and SAP Lumira.

 

The trial cloud for SAP HANA provides you with fast, free access to preconfigured SAP HANA landscapes built for specific use cases, letting you see how to work with SAP HANA through short tutorial activities.

 

hanalabs_start.png

 

In the case of SAP HANA One and Big Data, we configure an SP6 SAP HANA Server and an associated Windows development system loaded with SAP HANA Studio and SAP Lumira.

 

dev_system.jpg

 

You get access to your trial landscape for four hours to solve four technical exercises:

  1. Loading Hadoop-filtered data from S3 storage into SAP HANA
  2. Creating a Star Schema in SAP HANA One
  3. Building an Analytic View from SAP HANA One Tables
  4. Analyzing Wikipedia Data with SAP Lumira

 

The scenario features the latest release of Lumira Desktop. We are planning a future revision that makes use of the Smart Data Access feature to generate the Hive query to Elastic MapReduce on AWS; this will dramatically simplify the data loading process in exercises 1 and 2.

 

Since this is a public beta, we would appreciate your comments on what you like, what else you would like to see in the scenario, and how we can make it better.

 

Sign up for a trial system today!

 

Enjoy!

Meet me at SAP TechEd in Amsterdam


Let's keep this brief and simple:

 

 

So, if you're going to be there as well, this might be a chance to meet in person.

 

Cheers,

Lars

Big Data Webinar - Set-up and use of Smart Data Access in SP6 of SAP HANA


Join the Big Data webinar session this week, presented by two SAP Mentors: Clint Vosloo and Ethan Jewett. They will share their extensive insight and provide practical advice on how to use SAP's Big Data functionality to ensure that clean, timely data feeds your reports and analyses.

 

 

 

Clint Vosloo has over 16 years of IT industry experience, specializing in the DWH / BI space. Since being actively involved in the first Sybase IQ installation in South Africa back in 1997, he has accumulated a wealth of knowledge and experience using SAP Sybase IQ and BusinessObjects. His love for Big Data and his passion for bringing the "intelligence" part of BI to the forefront are evident in his ability to help businesses make better decisions by designing the correct solutions up front. In 2012, Clint was selected as Africa's first ever SAP Mentor. SAP Mentors are the top community influencers of the SAP ecosystem, hands-on experts in an SAP product or service as well as excellent champions of community-driven projects. He is currently the Managing Partner for APJ at EV Technologies, an SAP Gold Partner, SAP Authorized Education Partner, Software Solutions Partner and Sybase Partner in the SAP ecosystem. He has been involved in many large-scale BI implementations and has been asked to sit on various panels, including Dr. Ralph Kimball's panel discussion on dimensional modeling. His twitter handle is @vosloo777

 

 

 

Ethan Jewett has been a consultant in business intelligence and data warehousing for 8 years, primarily in the SAP & BusinessObjects world. He is an SAP Mentor (see what that means: http://scn.sap.com/docs/DOC-23155) and an Apache committer (http://people.apache.org/committer-index.html).

 

He is currently working on new approaches to data management problems, consulting on development of data management & visualization tools (SAP & non-SAP), and writing on the topic. His twitter handle: @esjewett and website: http://esjewett.com/

 

 

 

 

Title: Big Data - Set-up and use of Smart Data Access in SP6 of SAP HANA

Abstract:

 

This session provides an independent expert's recommendations for leveraging database and technology functionality from SAP to ensure that clean, timely data is feeding your reports and analyses. This demo-intensive session offers:

  • A deep dive into the capabilities of SAP HANA, SAP Sybase IQ, and SAP Sybase Adaptive Server Enterprise, examining strengths and tradeoffs based on the results of multiple POC projects
  • Instructions and a live demo on how to use NLS (near-line storage) in your SAP BW environment by archiving rarely accessed, read-only data into SAP Sybase IQ while keeping the hot data in SAP HANA
  • A live demo showing how the new data federation capabilities between SAP HANA, SAP Sybase IQ and Hadoop, available with SAP HANA SP6, enable dynamic data queries across heterogeneous relational and non-relational database systems

Come away with expert recommendations for optimising SAP's database and technology platform for the SAP BusinessObjects BI tool set.

 

 

Date: 29 October, 5-6 pm CET / 9 am PST

 

How to join: Dial-in information can be found in the attached calendar invite

 

 

 

 

Interested in finding out more?

If the webinar inspires you to explore and learn more about Big Data, be sure to check out the document I maintain as the single source for upcoming and on-demand webinars.

 

 

Join the Big Data conversation

 

You can check out the Big Data relevant social resources on the big data website or use the hashtags #SAP #BigData

 

 

 

@rukso

 

 

 

The SAP TechEd Amsterdam 2013 Mobile App - built on SAPUI5 and SAP HANA Cloud


I wanted to tell the story of the Bluefin development team - including DJ Adams, Lindsay Stanger, Oliver Rogers and others - who built an app for SAP TechEd Amsterdam that lets you build an agenda and never miss a session.

 

It was originally built with SAP ERP, Gateway and SAPUI5 with the sap.m library that Fiori uses - so it is Fiori-esque. To productize it, we took advantage of the shiny new HANA Cloud and, thanks to Amit Sinha, we ported the app onto HANA. It is now built entirely on the HANA platform, including HANA DB for storage, HANA XS for integration, and HANA XS with SAPUI5 and sap.m (the same code as before) for the web app.

 

I hope you enjoy it, because the team worked hard. To be clear, none of the credit resides with me - only the responsibility for the bugs, because I pushed them to port it to HANA so we could reliably support a large number of users. Please download and enjoy. In response to a few emails: this app is not sponsored or endorsed by SAP or TechEd.

 

SAP TechEd Amsterdam 2013 Mobile App

Features:

 

- Smartphone, Browser and Desktop compatible

- Search sessions by Type, Track, Level and Date

- Sessions coming up in the next hour

- Sessions in progress

- Add sessions to your agenda

- Favorite sessions

- View your agenda

- "People who like this also like this"

 

If you're interested... here's the story of how it came about.

 

I spoke to DJ Adams about it first. He's currently working for Bluefin within the SAPUI5 team at SAP.

 

"The idea came from when SAP published an Excel spreadsheet of the sessions for TechEd Amsterdam. I though... that data would be lovely in my pocket and I wanted a small example to practice and learn more SAPUI5. The two things came together and I built a small pocket reference to Amsterdam in SAPUI5.

 

The first thing I did was make the data usable: I brought the Excel spreadsheet into Google Docs and pointed my Spreadsheet to JSON converter service at this Google Doc. This dynamic service updates the JSON feed every time there's a change to the spreadsheet.

 

At the time I wrote most of it and got Joseph [DJ's son] to write some. What I find really interesting is that I'm a plumber, not a designer - but even as a dinosaur I find I can build nice-looking apps. I put it out as a challenge to the Bluefin Development Community to make it "TechEd Ready". Lindsay Stanger responded first of all and took it from there! There wasn't any authentication or favorites yet. Lindsay took it over, managed the project from there, and I got out of the way!"
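Out of curiosity, here is a minimal sketch of what consuming such a converted feed looks like in SAPUI5. The URL, array name and property names are hypothetical stand-ins, not the team's actual code:

    // A minimal sketch: load a remote JSON feed and bind it to a sap.m.List.
    // The URL and property names are hypothetical - the real feed came from
    // DJ's Spreadsheet to JSON converter pointed at the Google Doc.
    var oModel = new sap.ui.model.json.JSONModel();
    oModel.loadData("https://example.com/teched-sessions.json"); // hypothetical endpoint

    var oList = new sap.m.List({
      items: {
        path: "/sessions", // assumes the feed exposes a "sessions" array
        template: new sap.m.StandardListItem({
          title: "{title}",       // assumed property names
          description: "{track}"
        })
      }
    });
    oList.setModel(oModel);
    oList.placeAt("content"); // assumes a DOM element with id="content"

Because the model is loaded from a URL, re-publishing the spreadsheet automatically refreshes what the app shows, with no app change needed.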

 

What's the difference between SAPUI5 and Fiori?

 

DJ: "SAPUI5 is a framework built on JQuery which has a number of libraries like sap.ui.commons, sap.viz, and most importantly sap.m, which originally stood for Mobile and it had mobile-specific themes. This has become the go-to standard for building apps and as a result, the sap.m library was built out to become responsive for all device sizes and types including desktop web browsers.

 

Fiori is a set of applications built on the SAP Business Suite. Wave 1 of 25 apps came out earlier this year, Wave 2 is being released at TechEd with 25 more apps, and there are 150 more waiting in the wings. All of these apps have certain things in common. First, they run in HTML5 on smartphones, tablets and desktops. Second, they're role-based, which means you log on as yourself with your normal username and password, and based on your assignments you have access to different applications - your launch page is a series of tiles, personal to you.

 

If you take delivery of Fiori, you have to install the UI add-on for NetWeaver, including the UI5 libraries, which are stored in the ABAP MIME repository; your ABAP server can then serve the SAPUI5 artifacts. Then you install Fiori itself, which has two parts: the UI part, and the Gateway service for the data the app needs, exposed as OData services. Each OData service has a front-end part and a back-end part, which integrates into the Business Suite tables.

 

Fiori apps always use the sap.m and sap.ui.layout libraries which contain controls like switches, tables and tab bars, and those are all part of the SAPUI5 framework. This brings consistency of look and feel, plus responsive design."
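As a concrete aside, the "front-end part" DJ mentions means the UI binds directly against the Gateway OData service. A minimal sketch of such a binding in SAPUI5 follows - the service path, entity set and property names are all hypothetical:

    // A minimal sketch: bind a sap.m.Table against a Gateway OData service.
    // The service path, entity set and property names are hypothetical.
    var oModel = new sap.ui.model.odata.ODataModel(
      "/sap/opu/odata/sap/ZSESSIONS_SRV/", // hypothetical Gateway service
      true                                 // request JSON instead of Atom/XML
    );
    sap.ui.getCore().setModel(oModel);

    var oTable = new sap.m.Table({
      columns: [
        new sap.m.Column({ header: new sap.m.Text({ text: "Session" }) })
      ],
      items: {
        path: "/SessionSet", // hypothetical entity set
        template: new sap.m.ColumnListItem({
          cells: [ new sap.m.Text({ text: "{Title}" }) ] // assumed property
        })
      }
    });
    oTable.placeAt("content"); // assumes a DOM element with id="content"

The same binding pattern works whether the OData service is served by Gateway or, as in the ported app, by HANA XS.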

 

Is the TechEd app you built a Fiori app?

 

DJ: "Erm... Ye... No, but it was never designed to be a Fiori app. It's designed to be for smartphone only. The navigation part of app we built is based on the app control which is part of the sap.m library specifically for smartphones. On a desktop or screen, it stretches to the size of the screen. For larger screens, we have the split app control which contains the master on the left 1/3 and detail on the right 2/3, like Apple Mail. It will automatically use the right framework for the right device, and hide the detail in the smartphone as required. It does however use some of the Fiori controls.

 

Philosophically, Fiori is all about renewing existing apps, and SAP is in the process of defining exactly what Fiori means. Dick Hirsch wrote a good blog about this here. We're even delivering a TechEd hands-on session CD168 "Building SAP Fiori-like UIs with SAPUI5"."

 

fiori.jpg
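The master/detail behavior DJ describes maps onto the sap.m.SplitApp control. Here is a minimal sketch; the page ids and titles are illustrative, not the app's actual code:

    // A minimal sketch of sap.m.SplitApp: master (left 1/3) and detail
    // (right 2/3) side by side on tablets and desktops, automatically
    // collapsed to one page at a time on a smartphone.
    var oSplitApp = new sap.m.SplitApp({
      masterPages: [
        new sap.m.Page("master", { title: "Sessions" })        // session list goes here
      ],
      detailPages: [
        new sap.m.Page("detail", { title: "Session Details" }) // detail view goes here
      ]
    });
    oSplitApp.placeAt("content"); // assumes a DOM element with id="content"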

 

Productionizing the App

 

I talked to Lindsay Stanger, who was responsible for making the app production quality. She is a developer within the Bluefin development team.

 

"At that point, DJ had already made a substantial app where you can browse sessions by type or app. It was pretty static and just had data stored locally. I picked it up and got permission to make it a real project from Philippa Holland. We set out some sprints to move it away from being just an offline app and add the ability to save preferences and logins, and then the ability to create an agenda.

 

The priority list came partially from people within Bluefin who made suggestions. Originally the plan was for me to get used to the technology rather than to release it as an app, so we gathered input on what people thought would be useful. We had 4 or 5 week-long sprints determined - in case I got pulled onto a project, I wanted it to be releasable at the end of any sprint.

 

Sprint 1 allowed you to view all the sessions, filter by type/track/level and add a free text search, plus sort the sessions by date/time. Sprint 2 added date and time for each session so they could be grouped. Sprint 3 added in the login so people could register. Sprint 4 enabled the favorite feature, plus being able to see from a session what other sessions people liked. Sprint 5 enabled adding to an agenda and seeing that in a calendar view, and being able to select days and see what you were booked in for. At the end of Sprint 3, I sent a pilot out to our community to get feedback on what the layout and look-and-feel should be. There was a lot of collaboration from the team there and the feedback was essential!

 

We had a planned go-live date about 10 days before TechEd but, as it turned out, I finished development about two weeks before TechEd. I sent the release candidate out to our development team and we fixed a few bugs, before getting some help from Steffen Schwark to expose it on our code website.

 

It's been amazing how many people have helped me, including James Hale, who I work closely with and who helped me work through problems with HTML and JavaScript."

 

A brief interlude

 

I had a brief involvement here, when I mentioned that if we wanted 10,000 TechEd attendees to use this, we should move it out of Bluefin's corporate network and into the cloud. After a brief conversation with Amit Sinha, we provided a shiny new HANA Cloud instance to the team. I was personally interested in what it would take to port an app like this from SAP ERP to a native HANA app, and I thought HANA made the perfect platform for building apps quickly based on SAPUI5 technology.

 

Porting the app to the SAP HANA Cloud

 

Lastly I went to talk to Oliver Rogers, who ported the app to HANA. Oli is a development consultant at Bluefin and a seasoned HANA/mobile developer.

 

"Lindsay had asked me for some assistance around Gateway/JavaScript/SAPUI5 and when I got the request to move it to HANA... the next thing I knew I was doing it. I had to replicate the table structure from ERP over to HANA and then re-write the integration mechanism that Gateway provides into ERP, for the HANA platform.

 

I originally used XSOData to expose information, and I found I had to write some server-side JavaScript (XSJS) to call a stored procedure to create users. The nature of the app is that anyone can read anything - so you can see what anyone else is doing - without authentication. The preference-saving - what you're going to attend, or favoriting - is also written in XSJS, so we can check that it's the correct user. We had used a lot of boolean variables in Gateway, but HANA doesn't support these, so we had to rewrite them as single-character variables.

 

Provisioning HANA in the cloud was remarkably easy - you just log in and a few minutes later you have a working HANA instance. I connected to it via a web client, it upgraded to the latest version, and I was then able to access HANA in the cloud from HANA Studio on my desktop. It just worked."
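To give a flavor of the XSJS side, here is a minimal sketch of a service that calls a stored procedure to create a user. Every schema, procedure and parameter name below is hypothetical, and the real app naturally handled validation and error cases too:

    // A minimal XSJS sketch (server-side JavaScript on SAP HANA XS).
    // All schema, procedure and parameter names are hypothetical.
    var username = $.request.parameters.get("username");

    var conn = $.db.getConnection();
    try {
      // Call a stored procedure to create the user, as Oliver describes.
      var cstmt = conn.prepareCall("CALL \"TECHED\".\"CREATE_USER\"(?)");
      cstmt.setString(1, username);
      cstmt.execute();
      cstmt.close();
      conn.commit();

      $.response.contentType = "application/json";
      // Note the single-character flag instead of a boolean, since the
      // HANA tables used 'X' / '' rather than true / false.
      $.response.setBody(JSON.stringify({ created: "X", user: username }));
      $.response.status = $.net.http.OK;
    } finally {
      conn.close();
    }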

 

Final Words

 

I absolutely love this story, because it's a story of creating an app around data, which is really relevant and useful to people attending TechEd. It's a story of collaboration and teamwork in the Bluefin team and a story of the latest SAP Technologies, including SAP HANA, Gateway and SAPUI5. I think the app is a beautiful Fiori-esque app which is simple and usable.

 

So please download it and enjoy the team's hard work!

 

Special thanks has to go to the beta testers in the SAP Mentor community, especially Marilyn Pratt, Simon Kemp and Gregor Wolf, who all found bugs which we quickly fixed today.


SAP HANA Cloud Integration - POV 1 (of 2)


SAP TechEd Las Vegas 2013 provided an awesome opportunity to meet the SAP HANA Cloud Integration (HCI) Product Management team and discuss with them the current status and future direction of HCI. Here is the very knowledgeable (and happy) HCI Product Management team meeting SAP Mentors.

2013-10-24 12.30.08.jpg

(Left to Right: Sindhu Gangadharan, Udo Paltzer, Prakash Arunachalam, Subha Ramachandran. Missing in the picture: Holger Kunitz and Ginger Gatling)

 

I was quite satisfied with the conversation with the team, and now that TechEd is over, I have started gathering all the random thoughts generated during the HCI discussions, sessions and hands-on exercises. I soon realized that there were too many things to write about HCI and that I might not succeed in keeping them all well organized, so I would like to apologize in advance for this attempt.

 

Although there are already some blogs on SCN talking about HCI, I wanted to write this one from an 'Integration Architect' perspective, which might help potential customers (and probably SAP) think about using it as their strategic integration solution. HCI is the cloud-based integration solution (earlier known as NetWeaver Cloud Integration) introduced by SAP earlier this year. The 'Application Edition' of HCI has been available since March 2013, providing specific solutions for SuccessFactors, Financial Services Network (FSN) and S&OP (Sales and Operations Planning). Since August, customers have had the ability to extend these packages, e.g. with their own custom mappings. The next interesting step will be the 'Platform Edition' of HCI, planned for Q1 2014, providing any-to-any integration capabilities. You may find most of the SAP-published HCI documents here.

 

HCI Behind the Scenes:

SAP sees and promotes HCI as part of its SAP HANA Cloud platform offering. Although this clarifies SAP's intent to use HANA as the backbone of HCI, the current HCI offering doesn't entirely use HANA in the background - and there is a valid reason behind this.

 

As you might know, HCI is available in two flavors - Process Integration and Data Integration. Of these two, the Data Integration component does leverage HANA in the cloud. However, the architectural requirements of HCI's Process Integration component prevent the 'current' HANA cloud platform from being its ideal choice. (I will try to explain the reason as I understood it.) Process Integration requires data persistence for certain scenarios, and therefore replication of this data for disaster recovery is of utmost importance. The current HANA cloud architecture does not allow this replication in the cloud, so HCI currently uses the next best candidate, Sybase Replication Server, for its Process Integration capabilities. HCI Process Integration will eventually move to a HANA cloud with stabilized replication and disaster recovery.

 

Opinion:

  • These are technical back-end details currently managed entirely by SAP; for an HCI customer, there is no significant impact in terms of integration capabilities, implementation time or runtime.

 

Design Time:

If you have seen the 7.3x version of PI or PO in NWDS, you will realize that the HCI design time uses a similar Integration Designer perspective (both use an Eclipse-based development environment). Integration Flows (iFlows) are the most crucial design component of HCI, as they are in PI 7.3x, and the integration flows use BPMN 2.0-based notation. As soon as you notice this similarity, there is a natural tendency (maybe it was just me ;)) to compare the HCI design time with PI. I would suggest simply not doing it.

 

A crucial point to understand is that, architecturally, HCI is completely different from PI and is 'not' PI in the cloud. The design time was 'made similar' to that of PI to enable the larger PI community to leverage their experience while using HCI.

IFL.jpg

 

This doesn't mean that the design times of HCI and PI are completely incompatible. The mappings available or created in HCI can be imported into PI as operation maps - this was confirmed by SAP. So why wouldn't the other direction - moving PI operation maps to HCI - be technically possible? The answer is that it should be, but with some limitations. Currently, there are several features that the HCI design time lacks (and I am quite sure SAP is working on them). If a feature used in a PI mapping is unavailable in HCI, then obviously the imported mapping in HCI will be incorrect.

 

Opinion:

  • I don't know if SAP plans to bring the HCI design time on par with PI's, but in my opinion it would be a good idea; if they do, they should start creating and sharing a compatibility matrix between the two design times.
  • There might be customers with small PI landscapes planning to move to HCI completely in the future. If SAP thinks this number is significant, a migration strategy should be made available for these customers.
  • There are some missing features that should be included, e.g. navigating the message structure while providing conditions in the case of multiple receivers. Currently, we have to type these manually.
  • The Enterprise Integration Patterns provide good templates for building integration flows and help reduce development time.
  • Apart from using the Eclipse-based Integration Designer perspective for modelling, you may also use the WebUI available for configuring Data Integration. For a larger landscape with multiple developers, this can create a version management issue, as developers can make changes in both places in parallel. Changes deployed in one tool will simply overwrite the changes in the other without any warning or error, and currently there are no version management guidelines for resolving such conflicts.

 

 

Pre-Packaged Content:

Providing pre-packaged design-time content has always worked well for SAP and, in my opinion, is always an excellent idea - we have witnessed its success with SAP PI and PO. HCI provides pre-packaged content for the various data and process integration solutions it offers, and SAP plans to add more content for other solutions like Ariba. Here is an overview of the pre-packaged content available. I liked the statement "Fully supported, certified and tested".

2013-10-22 11.48.12.jpg

 

You may check the complete pre-packaged content catalog here using your SCN ID.

 

Feedback about this HCI catalog (and, I hope, about the entire product) can be provided to the HCI team at hcifeedback@sap.com.

 

Opinion:

  • Please announce upcoming content if it is planned and on track for delivery soon. This will help potential customers plan for a long-term cloud integration solution. For example, a customer choosing Dell Boomi for Ariba integration today won't be happy to realize that similar capabilities will be available with HCI in Q1 2014.
  • An option for customers and partners to develop and certify their own pre-packaged content should be available.

 

 

... In order to accommodate my extended POV and make the blog more readable, I am splitting the content into two parts. Check out the 2nd part here.
