Careers at Orzota

Big data careers

At Orzota, our mission is to help businesses gain insights from data. We are based in Silicon Valley, and our founders have worked at premier technology companies such as Sun, Yahoo! and Netflix. Our strong engineering credentials have helped us build a solid team of exceptional Big Data professionals. If you are passionate and self-motivated, with a desire to build a Big Data career, come join us. We ensure all our employees have the right training and skills to advance their big data careers.

Do you want to be part of the Big Data ecosystem?

If you are self-motivated, passionate, and want to see big data technology used to solve complex problems, we are looking for you. If you are an experienced architect or developer working in technologies such as Hadoop, Spark, Cassandra, etc. and are looking for your next big challenge, you have come to the right place. If you are just starting your engineering career and Big Data technologies sound fun and exciting, you have also come to the right place. Orzota specializes exclusively in Big Data and NoSQL technologies: building products and solutions to manage and analyze Big Data, and providing corporate training in these technologies. Our geographically distributed team comprises self-driven individuals who are passionate about making a difference.


Our unique work environment provides rapid training for new college grads while they work on real projects, while senior engineers will find challenging, complex problems to solve and deliver, both within the company and for our clients.

India Job Openings

In India, our engineering offices are located in Chennai. Customer-facing roles may require travel and/or relocation, which will be called out in the specific job posting.


You must be a fast learner and have the ability to turn around a small project quickly. Although you will have technical guidance and mentors, we are really looking for candidates with the ability and self-confidence to solve problems on their own.

If you are looking for experience with a wide variety of technologies in the big data ecosystem, this is a great opportunity.
LOCATION: Chennai, India


  • Hands-on experience with at least some aspects of big data engineering: data ingestion from various types of sources, and common data cleansing and transformation techniques.
  • Experience developing web applications (REST APIs, JavaScript, HTML, CSS) as well as back-end Java applications. Experience using frameworks such as D3 or Play is a big plus.
  • Ability to develop, debug, and deploy applications using common IDEs and agile methodologies is required.
  • Good programming skills in Java/Scala and Python are a must.
  • Knowledge of Linux and scripting at a level that lets you interact with systems comfortably is required.


  • Required – JavaScript, HTML, CSS, Java, Python, Hadoop, Linux
  • Preferred – Java Enterprise Edition

Apply Now


We are building Big Data platforms and solutions, and we would be interested in hearing from candidates who would be excited to work on Hadoop and associated technologies. We offer an exciting and dynamic work environment that is unique to start-ups. The Orzota team believes in challenging each other to explore new technical terrain and is highly proud of its learning environment!
We are looking for people with 1 to 4 years of experience.

Degree: Bachelor's or Master's degree in computer science or computer engineering, or equivalent professional experience
Location: Chennai, India

Responsibilities and Expectations:

  • Design and develop applications/scripts under tight deadlines with minimal supervision
  • Define, articulate and translate technical designs with the appropriate details to business and technical teams
  • Develop project construction, test, and deployment plans
  • Take ownership, coordinate and complete project tasks
  • Participate in technical reviews throughout the course of development

Required Skills:

  • Experience in Java and willingness to work in the Big Data area.
  • Experience with Python tools such as IPython, easy_install, pdb, and pip
  • Broad range of experience with Python frameworks such as Django, and the Python Stomp client

Desired Skills:

  • Experience with automated VM provisioning and Amazon Web Services (AWS)/Elastic Compute Cloud (EC2) or related services is a plus
  • Experience using source control; Git preferred
  • Experience with any of the Big Data technologies would be given precedence

Apply Now

US Job Openings

Our offices in the USA are located in the San Francisco Bay Area, CA and Houston, TX. Most US positions will require travel and/or relocation to a client location.


The qualified candidate will have the following experience:

  • Minimum of 1 year of Hadoop stack experience, preferably both as a developer and an administrator. Administration is a must: setup/teardown, configuration management, and debugging problems.
  • Knowledge of monitoring and debugging applications in Cloudera/Hortonworks environments; an understanding of how Hadoop works; tracking errors and performance issues.
  • At least 1-2 years of Linux administration, with an understanding of networking, security, etc. Knowledge of Puppet/Chef; scripting in shell/Perl.

Required experience: Linux administration: 1-2 years
Required education: Bachelor's degree in Engineering/Computer Science

Apply Now


This position will require relocation to client locations after appropriate training. As a data engineer, you should be familiar with, and have hands-on experience in, all aspects of big data engineering, from data ingestion of various types of sources to common data cleansing and transformation techniques. Alternatively, you must have 3-4 years of strong Java/J2EE, database, and data warehouse (ETL, data modeling, analytics) experience. The ability to develop, debug, and deploy applications using common IDEs in a Linux environment is required.

Required experience:

  • Big Data engineer: 1+ years
  • Java/J2EE developer: 3+ years

Required education:
Minimum of Bachelor’s Degree in Engineering, Master’s preferred

Apply Now

What you will need:

  • Strong experience with object-oriented design, coding, and testing patterns
  • Experience building big data solutions using Hadoop technologies
  • 2+ years of Java development experience
  • Strong knowledge of Linux and scripting
  • Solid understanding of SQL and hands-on experience using MapReduce, Hive, and/or Pig
  • Experience using one or more of Kafka, Storm, Flume
  • Understanding of HDFS, MapReduce, HBase
  • Experience with stream processing technologies such as Storm/Spark is a plus
  • Experience with other technologies such as Amazon EC2 is a plus
  • Ability to quickly triage and troubleshoot complex problems; a strong team player