Careers at Orzota

Big data careers

At Orzota, our mission is to help businesses gain insights from data. We are based in Silicon Valley, and our founders have worked at premier technology companies such as Sun, Yahoo!, and Netflix. Our strong engineering credentials have helped us build a solid team of exceptional Big Data professionals. If you are passionate and self-motivated, with a desire to build a Big Data career, come join us. We ensure all our employees have the right training and skills to advance their Big Data careers.

Do you want to be part of the Big Data ecosystem?

If you are self-motivated, passionate, and want to see Big Data technology used to solve complex problems, we are looking for you. If you are an experienced architect or developer working with technologies such as Hadoop, Spark, and Cassandra and are looking for your next big challenge, you have come to the right place. If you are just starting your engineering career and Big Data technologies sound fun and exciting, you have also come to the right place. Orzota specializes exclusively in Big Data and NoSQL technologies: building products and solutions to manage and analyze Big Data, and providing corporate training in these technologies. Our geographically distributed team comprises self-driven individuals who are passionate about making a difference.

Our unique work environment provides rapid training for new college grads while they work on real projects, while senior engineers will find challenging, complex problems to solve and deliver, both within the company and for our clients.

India Job Openings

In India, our engineering offices are located in Chennai. Customer-facing roles may require travel and/or relocation, which will be called out in the specific job posting.

ORZOTA – FULL STACK ENGINEER - JAVA/SCALA/PYTHON AND WEB APPLICATIONS (4 TO 7 YRS)

Responsibilities
You must be a fast learner with the ability to turn around a small project quickly. Although you will have technical guidance and mentors, we are really looking for a candidate who has the ability and self-confidence to solve problems on their own.

If you are looking for experience with a wide variety of technologies in the Big Data ecosystem, this is a great opportunity.
Location: Chennai, India

Requirements

  • Hands-on experience with at least some aspects of big data engineering: data ingestion from various types of sources, and common data cleansing and transformation techniques.
  • Experience developing web applications (REST APIs, JavaScript, HTML, CSS) as well as back-end Java applications. Experience using frameworks such as D3 and Play is a big plus.
  • Ability to develop, debug, and deploy applications using common IDEs and agile methodologies is required.
  • Good programming skills in Java/Scala and Python are a must.
  • Knowledge of Linux and scripting at a level that lets you interact with systems comfortably is required.

Skills

  • Required – JavaScript, HTML, CSS, Java, Python, Hadoop, Linux
  • Preferred – Java Enterprise Edition

Apply Now

BIG DATA DEVELOPER

We are building Big Data platforms and solutions, and we would be interested in hearing from candidates who are excited to work on Hadoop and associated technologies. We offer an exciting and dynamic work environment that is unique to start-ups. The Orzota team believes in challenging each other to explore new technical terrain and is highly proud of its learning environment!
We are looking for people with 1 to 4 years of experience.

Degree: Bachelor's or Master's degree in computer science or computer engineering, or equivalent professional experience
Location: Chennai, India

Responsibilities and Expectations:

  • Design and develop applications/scripts under tight deadlines with minimal supervision
  • Define, articulate and translate technical designs with the appropriate details to business and technical teams
  • Develop project construction, test, and deployment plans
  • Take ownership, coordinate and complete project tasks
  • Participate in technical reviews throughout the course of development

Required Skills:

  • Experience in Java and willingness to work in the Big Data area.
  • Experience with Python tools such as IPython, easy_install, pdb, and pip
  • Broad experience with Python frameworks such as Django and the Python Stomp client

Desired Skills:

  • Experience with automated VM provisioning and Amazon Web Services (AWS)/Elastic Compute Cloud (EC2) or related services a plus
  • Experience using source control; Git preferred
  • Experience with any of the Big Data technologies will be given precedence

Apply Now

ORZOTA - HADOOP/LINUX ADMINISTRATOR - DEVOPS/CLIENT SERVICES (2-5 yrs)

Responsibilities:

  • Should take ownership of and be responsible for architecting, designing, implementing, and administering Hadoop infrastructure.
  • Should have prior experience with Hadoop administration, including:
    • Set up, configure, and maintain Hadoop clusters, including components such as Kafka, Storm, Hive, Pig, Spark, HDFS, HBase, Oozie, Sqoop, Flume, ZooKeeper, etc.
    • Provide support for data integration and ETL pipelines.
    • Provision users and groups, ACL permissions, Kerberos, and other security requirements.
    • Monitor, diagnose problems, and fix issues.
    • Automate tasks and operations via scripting.

If interested, kindly mention your:

  • Full Name:
  • Total Experience:
  • Current Company:
  • Current CTC:
  • Expected CTC:
  • Notice Period (days):
  • Current Location:
  • Current Employment Status (Permanent/Contract):
  • Holding Any Offers (Yes/No):
  • Willing to Relocate to Mumbai (Yes/No):

Requirements:

  • Minimum 1 to 2 years of Hadoop administration experience
  • Minimum 3 years of Linux administration experience
  • Good communication skills
  • 2 to 5 years of experience as a DevOps engineer, Linux administrator, or similar
  • Required expertise: Linux, Bash, HDP (Hortonworks Data Platform)
  • Cloud administration on AWS or Azure desirable
  • Should have managed on-premises infrastructure as well

Location: Mumbai
Apply Now

US Job Openings

Our offices in the USA are located in the San Francisco Bay Area, CA and Houston, TX. Most US positions will require travel and/or relocation to a client location.

HADOOP DEVOPS/ADMINISTRATOR

The qualified candidate will have the following experience:

  • Minimum of 1 year of Hadoop stack experience, preferably as both a developer and an administrator; administration is a must. Set-up/tear-down, configuration management, and debugging problems.
  • Knowledge of monitoring and debugging applications in the Cloudera/Hortonworks environment. Understanding of how Hadoop works, tracking errors and performance issues.
  • At least 1-2 years of Linux administration. Understanding of networking, security, etc. Knowledge of Puppet/Chef; scripting via shell/Perl.

Required experience: Linux administration, 1-2 years
Required education: Bachelor's degree in Engineering or Computer Science

Apply Now

BIG DATA ENGINEER/TRAINEE

This position will require relocation to client locations after appropriate training. As a data engineer, you should be familiar with and have hands-on experience with all aspects of big data engineering, from data ingestion of various types of sources to common data cleansing and transformation techniques. Alternatively, you must have 3-4 years of strong Java/J2EE, database, and data warehouse (ETL, data modeling, analytics) experience. Ability to develop, debug, and deploy applications using common IDEs in a Linux environment is required.

Required experience:

  • Big Data engineer: 1+ years
  • Java/J2EE developer: 3+ years

Required education:
Minimum of a Bachelor's degree in Engineering; Master's preferred

What you will need:

  • Strong experience with object-oriented design, coding, and testing patterns
  • Experience building big data solutions using Hadoop technologies
  • 2+ years of Java development experience
  • Strong knowledge of Linux and scripting
  • Solid understanding of SQL and hands-on experience using MapReduce, Hive, and/or Pig
  • Experience using one or more of Kafka, Storm, Flume
  • Understanding of HDFS, MapReduce, HBase
  • Experience with stream processing technologies such as Storm/Spark is a plus
  • Experience with other technologies such as Amazon EC2 is a plus
  • Ability to quickly triage and troubleshoot complex problems. A strong team player

Apply Now

SENIOR SOFTWARE ENGINEER

Job Description:

  • Leverage knowledge and expertise of big data technologies, data sources, pipelines and processes to provide recommendations to the company for new solution offerings and enhancements to existing ones.
  • Understand the business and technical constraints faced by customers so we can better assist them in using big data and related technologies.
  • Working with architects and business leaders, design the applications to meet stated business and technical goals using Orzota’s pre-built solutions and solution accelerators and appropriate technologies such as Hadoop, Spark, Kafka, Cassandra, Tableau, BI tools, etc.
  • Create design documents, articulate design decisions in presentations and meetings to get buy-in while also incorporating feedback. Customers evaluate us not just based on technical merits but also our ability to consider their requirements and feedback.
  • Create prototypes, lead proof-of-concept projects, evaluate options with pros and cons, and provide recommendations to customers.
  • Implement applications, work in collaboration with technology partners and offshore engineers to customize the necessary modules and software. Ensure quality of all project deliverables.
  • Leverage experience in developing distributed and big data applications to set up milestones for agile/SCRUM development.
  • Work with the QA team, providing knowledge and understanding of applications and modules so they can develop proper tests.
  • When a project is complete, ensure smooth transition by providing necessary training on all aspects of the applications including design, implementation, deployment, operations, monitoring and tuning.
  • As an expert on big data technologies and company solutions, help our sales team deliver technical demos of our solutions to customers.
  • As a Senior Software Engineer, mentor other junior engineers to resolve technical difficulties.

Qualifications:

  • Bachelor’s/Master’s in Computer Science or Computer Engineering.
  • At least 10 years of experience in software, including databases, data warehousing, and BI.
  • 2+ years of experience designing and developing applications using big data components such as Hadoop and Spark, and NoSQL databases such as Cassandra and/or HBase.
  • Programming experience in SQL, Java, and Scala.
  • Proven ability to lead both on-shore and off-shore teams in the development of big data solutions.
  • Deep expertise in the principles and architecture of various technologies within the big data stack is required.
  • Experience implementing end-to-end big data solutions across multiple technologies and platforms.
  • Ability to communicate clearly and effectively in interpersonal and written formats.
  • Familiarity with agile development and DevOps best practices.

Apply Now

SOFTWARE ENGINEER

Proposed Job Duties:

  • Develop required analytics components for Orzota Solutions.
  • Integrate Big Data technologies such as Cassandra, H2O, Deep Learning, etc. into the Orzota Platform.
  • Build streaming solutions using Kafka, Storm, and Spark Streaming for IoT applications.
  • Build fault-tolerant, self-healing, adaptive, and highly accurate data computation pipelines for the Orzota Customer Knowledge and other solutions.
  • Work as a member of Orzota's global development team, taking part in regular staff meetings and daily standups for agile sprints, and help the company build world-class products.

Qualifications:

  • Bachelor's or Master's degree in computer science or computer engineering
  • 1 to 4 years of experience
  • Programming experience in SQL, Java, Python, R, Scala, etc.
  • Experience using open source big data components such as Hadoop, Spark, Kafka, Cassandra, etc.

Apply Now
