Hadoop Solutions Architect Engineering at Geebo

Hadoop Solutions Architect

Company Name:
Valleysoft Inc
Possession of the fundamental skills of a solution architect, with the ability to probe and resolve vague requirements
Ability to analyze existing systems through an interview process with technology SMEs to develop a complete end-to-end Current State Architecture to drive and define the Future State Architecture
Takes a holistic view and communicates the enterprise view to others to ensure coherence of all aspects of the project as an integrated system
Performs gap analysis between the current state and future state architectures to identify single points of failure, capabilities, capacity, fault tolerance, hours of operation (SLAs), change windows, etc.
Strong verbal and written communication, with proven skills in facilitating design sessions.
Able to influence, conceptualize, visualize and communicate the target architecture.
Able to communicate complex technical or architecture concepts in a simple manner and adapt to different audiences.
Ability to work independently and apply industry architecture best practices, guidelines, standards, principles, and patterns while working with infrastructure and technology project teams
Ability to document end-to-end application transaction flows through the enterprise
Ability to document the technology architecture decision process, and if required develop relevant templates
Resolves conflicts among infrastructure and application teams, business units, and other architects
Identifies opportunities to cut costs without sacrificing overall business goals.
Ability to estimate the financial impact of solution architecture alternatives / options
Knowledge of all architecture capabilities deemed necessary for an Enterprise Technical Architecture.
Proficiency in Big data solutions such as Hadoop/HDFS; NoSQL databases such as HBase, Cassandra, CouchDB, and Accumulo; and tools such as Hive, Pig, and Crunch, along with a strong background in architecting Big data deployments in the healthcare and retail space.
Preference will be given to candidates who have exposure to the Hortonworks distribution with HDFS, Pig, Hive, Crunch, Spark, Accumulo, and Storm.
Knowledge of RDBMSs (e.g., Netezza, Oracle, DB2) is preferred. Knowledge of integration tools such as Ab Initio and of Big data integration tools such as Sqoop and Flume is also preferred. Knowledge of front-end and visualization tools such as Pentaho and Tableau is good to have, but experience with any reporting tool infrastructure (e.g., Business Objects or Cognos) is preferable.
Knowledge of analytical tools such as R/SAS and of Big data analytical tools such as Mahout is good to have. We are not seeking individuals who are proficient with the actual analytical algorithms; rather, we are looking for individuals who understand the impact of these analytical solutions on a Big data platform and who know the best practices for designing a system for analytical workloads.
If you are interested, please send your resume to
Or apply directly here: http://jobsbridge.com/JobSearch/View.aspx?JobId=30375
Estimated Salary: $20 to $28 per hour based on qualifications.
