Careers at Orzota

Big data careers

At Orzota, our mission is to make Big Data easy to consume. We are based in Silicon Valley, and our founders have worked at premiere technology companies such as Sun, Yahoo! and Netflix. Our strong engineering credentials have helped us build a solid team of exceptional Big Data professionals. If you are passionate and self-motivated, with a desire to grow a career in Big Data, come join us. We ensure all our employees have the right training and skills to advance their careers.

Do you want to be part of the Big Data ecosystem?

If you are self-motivated, passionate, and want to see big data technology used to solve complex problems, we are looking for you. If you are an experienced architect or developer working with technologies such as Hadoop, Spark, Cassandra, etc. and are looking for your next big challenge, you have come to the right place. On the other hand, if you are just starting your engineering career and Big Data technologies sound fun and exciting, you have also come to the right place.

Orzota specializes exclusively in Big Data technologies: building products and solutions to manage and analyze Big Data while helping our clients with these complex technologies by implementing custom projects for them. Our founders are deep technologists who have worked in engineering roles at leading Silicon Valley companies such as Sun Microsystems, Oracle, Yahoo! and Netflix. Our geographically distributed team comprises self-driven individuals who are passionate about making a difference.


Our unique work environment provides rapid training for new college grads working on real projects, while senior engineers will find challenging, complex problems to solve and deliver, both within the company and for our clients.

India Job Openings



  • Should take ownership of and be responsible for architecting, designing, implementing, and administering Hadoop infrastructure
  • Should work with the Dev and UI/UX teams to ensure high data quality and availability
  • Should be able to plan for backup and high availability
  • Should have exposure to building secure Hadoop infrastructure

  • Experience in Hadoop administration
  • Experience in Linux administration
  • Willing to work in rotational shifts, if required
  • Good communication skills
  • 2 to 3 years of experience required
  • Should be able to join immediately
  • Required Expertise: Linux, Bash, Python, HDP, AWS, Azure

Apply Now



  • Should take ownership of and be responsible for architecting, designing, implementing, and administering Hadoop infrastructure
  • Should have prior experience with Hadoop administration, including:
    • Setting up, configuring, and maintaining Hadoop clusters, including components such as Kafka, Storm, Hive, Pig, Spark, HDFS, HBase, Oozie, Sqoop, Flume, ZooKeeper, etc.
    • Providing support for data integration and ETL pipelines
    • Provisioning users and groups, ACL permissions, Kerberos, and other security requirements
    • Monitoring, diagnosing problems, and fixing issues
    • Automating tasks and operations via scripting
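As a hypothetical illustration of the kind of scripted monitoring and automation described above (the log format and messages are invented for the example; real input would come from a daemon log such as a NameNode log), a minimal Python sketch might scan service logs for errors:

```python
import re

# Hypothetical log lines standing in for a Hadoop daemon log file.
LOG_LINES = [
    "2016-03-01 10:00:01 INFO  Block report processed",
    "2016-03-01 10:00:05 ERROR Disk failure on /data/3",
    "2016-03-01 10:00:09 WARN  Replication below threshold",
    "2016-03-01 10:00:12 ERROR DataNode heartbeat lost",
]

def find_errors(lines):
    """Return (timestamp, message) pairs for ERROR-level entries."""
    pattern = re.compile(r"^(\S+ \S+) ERROR\s+(.*)$")
    hits = []
    for line in lines:
        m = pattern.match(line)
        if m:
            hits.append((m.group(1), m.group(2)))
    return hits

errors = find_errors(LOG_LINES)
for ts, msg in errors:
    print(f"{ts}: {msg}")
```

In production, a script like this would typically run from cron and feed an alerting system rather than print to stdout.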

  • Experience in Hadoop administration
  • Experience in Linux administration
  • Good communication skills
  • 2 to 5 years of experience required as a DevOps engineer, Linux administrator, DBA, etc.
  • Required Expertise: Linux, Bash, HDP
  • Cloud administration on AWS or Azure desirable
  • Should have managed on-premises infrastructure as well

Apply Now


You must be a fast learner and have the ability to turn around a small project quickly. Although you will have technical guidance and mentors, we are really looking for a candidate who has the ability and self-confidence to solve problems on their own.

If you are looking for experience with a wide variety of technologies in the big data ecosystem, this is a great opportunity.
LOCATION: Chennai, India


  • Hands-on experience with at least some aspects of big data engineering: data ingestion from various types of sources, and common data cleansing and transformation techniques.
  • Experience developing web applications, including REST APIs, JavaScript, HTML, and CSS, as well as back-end Java applications. Experience using frameworks such as D3 and Play is a big plus.
  • Ability to develop, debug, and deploy applications using common IDEs and agile methodologies is required.
  • Good programming skills in Java/Scala and Python are a must.
  • Knowledge of Linux and scripting at a level where you can interact with systems comfortably is required.
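As a hypothetical example of the data cleansing and transformation work mentioned above (record fields are invented for illustration), a small Python sketch might normalize raw ingested records:

```python
# Minimal cleansing/transformation sketch: trim whitespace,
# normalize case, and drop records missing a required field.
raw_records = [
    {"name": "  Alice ", "city": "CHENNAI", "age": "34"},
    {"name": "Bob", "city": " chennai", "age": ""},
    {"name": "", "city": "Bangalore", "age": "41"},
]

def clean(record):
    """Return a normalized record, or None if it fails validation."""
    name = record.get("name", "").strip()
    age = record.get("age", "").strip()
    if not name or not age.isdigit():
        return None  # drop incomplete records
    return {
        "name": name,
        "city": record.get("city", "").strip().title(),
        "age": int(age),
    }

cleaned = [r for r in (clean(rec) for rec in raw_records) if r is not None]
print(cleaned)  # only the fully valid record survives
```

The same pattern scales up naturally: the `clean` function becomes a map step over a much larger ingested dataset.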


  • Required – JavaScript, HTML, CSS, Java, Python, Hadoop, Linux
  • Preferred – Java Enterprise Edition

Apply Now



  • Can build responsive or adaptive screens from visual designs using HTML and CSS3
  • Able to translate high-level requirements into interaction flows and artifacts, and transform them into beautiful, intuitive, and functional user interfaces
  • Translates designs and style guides provided by the UI/UX team into functional user interfaces, ensuring cross-browser compatibility and performance
  • Contributes to continual improvement by suggesting improvements to the user interface, software architecture, or new technologies
  • In-depth knowledge of JavaScript, jQuery, and Bootstrap
  • Knowledge of Sass is good to have

JS & Frameworks

  • Highly skilled at front-end engineering using object-oriented JavaScript and various JavaScript libraries and micro-frameworks (such as jQuery, Angular, Prototype, Dojo, Backbone, YUI)
  • Web user interface programming using JavaScript, AJAX, JSON, XML, HTML5, CSS, and jQuery
  • Exposure to other frameworks such as AngularJS, React.js, Ext JS, Dojo, etc. would be a plus

MVC & Development

  • Should have experience in PHP or Python
  • Good back-end MVC exposure (Rails/Django/Angular+Node)
  • Builds software applications, follows coding standards, and writes appropriate unit tests, integration tests, and deployment scripts
  • Design and implement RESTful services


  • Experience with relational databases (MySQL/PostgreSQL/Oracle/SQLite) and/or NoSQL databases.
  • Able to design schemas, with experience writing views/procedures.
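As a hypothetical sketch of the schema and view design mentioned above (table and column names are invented for illustration), using SQLite through Python's standard library:

```python
import sqlite3

# In-memory database for illustration; a real deployment would use
# MySQL/PostgreSQL/Oracle or a persistent SQLite file.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A simple schema: customers and their orders.
cur.executescript("""
CREATE TABLE customers (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL
);
CREATE TABLE orders (
    id          INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customers(id),
    amount      REAL NOT NULL
);
-- A view aggregating order totals per customer.
CREATE VIEW customer_totals AS
SELECT c.name, SUM(o.amount) AS total
FROM customers c
JOIN orders o ON o.customer_id = c.id
GROUP BY c.id;
""")

cur.execute("INSERT INTO customers VALUES (1, 'Asha'), (2, 'Ravi')")
cur.execute("INSERT INTO orders VALUES (1, 1, 100.0), (2, 1, 50.0), (3, 2, 75.0)")
rows = cur.execute("SELECT name, total FROM customer_totals ORDER BY name").fetchall()
print(rows)
```

The view hides the join/aggregation from application code, which is the kind of design decision this role would make day to day.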


  • Good communication and project management skills
  • Self-directed team player who thrives in a fluid environment
  • Very good logical skills and problem-solving ability
  • Quick learner and a good team player

Apply Now


We are building Big Data platforms and solutions, and we would be interested in hearing from candidates who would be excited to be working on Hadoop and associated technologies. We offer an exciting and dynamic work environment that is unique to start-ups. The Orzota Team believes in challenging each other to explore newer tech terrains and is highly proud of its learning environment!
We are looking for people with 1 to 4 years of experience.

Degree:  Bachelor's or Master's degree in computer science or computer engineering, or equivalent professional experience
Location: Chennai, INDIA

Responsibilities and Expectations –

  • Design and develop applications/scripts under tight deadlines with minimal supervision
  • Define, articulate and translate technical designs with the appropriate details to business and technical teams
  • Develop project construction, test, and deployment plans
  • Take ownership, coordinate and complete project tasks
  • Participate in technical reviews throughout the course of development

Required Skills:

  • Experience in Java and willing to work in the Big Data area
  • Experience with Python tools such as IPython, easy_install, pdb, and pip
  • Broad range of experience with Python frameworks such as Django, and the Python Stomp client

Desired Skills:

  • Experience with automated VM provisioning and Amazon Web Services (AWS)/Elastic Compute Cloud (EC2) or related services a plus
  • Experience using source control, Git preferred
  • Experience with any of the Big Data technologies would be given precedence

Apply Now

US Job Openings


The qualified candidate will have the following experience:

  • Minimum of 2 years of Hadoop stack experience, preferably as both a developer and an administrator. Administration is a must: setup/teardown, configuration management, and debugging problems
  • Knowledge of monitoring and debugging apps in a Cloudera/Hortonworks environment. Understanding of how Hadoop works, and of tracking errors and performance issues
  • At least 1-2 years of Linux administration. Understanding of networking, security, etc. Knowledge of Puppet/Chef, and scripting via shell/Perl

Required experience: Hadoop administration: 2 years
Required education:  Bachelor's degree in Engineering or Computer Science

Apply Now


As a key member of a Big Data team, you will be responsible for designing and developing major components of big data stream and batch processing applications. As a data engineer, you should be familiar with and have hands-on experience in all aspects of big data engineering, from data ingestion from various types of sources to common data cleansing and transformation techniques. Ability to develop, debug, and deploy applications using common IDEs and agile methodologies is required.

Required experience:

  • Big Data engineer: 1+ years
  • Java/J2EE developer: 2+ years

Required education:
Minimum of Bachelor’s Degree in Engineering, Master’s preferred

Apply Now

What you will need:

  • Strong experience with object-oriented design, coding and testing patterns
  • Experience building big data solution using Hadoop technologies
  • 2+ years Java development experience
  • Strong knowledge of Linux and scripting
  • Solid understanding of SQL and hands-on experience using Map/Reduce, Hive, and/or Pig
  • Experience using one or more of Kafka, Storm, Flume
  • Understanding of HDFS, Map/Reduce, HBase
  • Experience with stream processing technologies such as Storm/Spark is a plus
  • Experience with other technologies such as Amazon EC2 is a plus
  • Ability to quickly triage and troubleshoot complex problems. A strong team player.
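As a hypothetical illustration of the Map/Reduce pattern referenced above (sketched in plain Python rather than on a Hadoop cluster; the input lines are invented), a word count can be expressed as separate map and reduce phases:

```python
from collections import defaultdict

# Input "records", standing in for lines of a file in HDFS.
lines = [
    "big data big plans",
    "data pipelines move big data",
]

def map_phase(line):
    """Mapper: emit (word, 1) for each word in a line."""
    for word in line.split():
        yield (word, 1)

def reduce_phase(pairs):
    """Reducer: sum counts per word (the shuffle step is implicit here)."""
    totals = defaultdict(int)
    for word, count in pairs:
        totals[word] += count
    return dict(totals)

pairs = [kv for line in lines for kv in map_phase(line)]
counts = reduce_phase(pairs)
print(counts)
```

On Hadoop, the same mapper and reducer logic would run distributed across the cluster, with the framework handling the shuffle between the two phases.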