Principal Software Engineer - Data at Netskope
San Francisco, CA, US
We are looking for skilled engineers with an eye for building and optimizing distributed systems to join our team. From data ingestion and processing to edge computing, we work closely with other engineers and Product Management to build consistent and highly available systems that tackle real-world data problems. Our customers, both internal and external, depend on us to provide accurate, real-time, and fault-tolerant solutions to their ever-growing data needs. This is a highly technical position with responsibility for leading the development, validation, publishing, and maintenance of logical and physical data models that support various OLTP and analytics environments.
 
Role and Responsibilities:
  • Design and implement a cloud-scale, distributed, data-focused Data Platform, along with services and frameworks, including solutions that address high-volume and complex data collection, processing, transformation, and reporting for analytical purposes
  • Work with the application development team to implement data strategies, build data flows and develop conceptual data models
  • Understand and translate business needs into data models supporting long-term solutions
  • Develop best practices for naming conventions and coding standards to ensure consistency of data models
  • Analyze data-related system integration challenges and propose appropriate solutions
  • Research and identify effective data designs, new tools, and methodologies for data analysis
  • Provide guidance and expertise to the development community on effective implementation of data models and building better data access services
  • Work through all stages of the data solution life cycle: analyzing and profiling data, creating conceptual, logical, and physical data model designs, and deriving reporting and analytics solutions
  • Evaluate existing data and physical databases for variances and discrepancies
  • Participate in data strategy and road map exercises, data architecture definition, business intelligence, data lake tool selection, design and implementation
  • Provide technical leadership in all phases of a project from discovery and planning through implementation and delivery

Qualifications:

  • At least 8 years of hands-on experience in the architecture, design, or development of enterprise data solutions, applications, and integrations
  • Ability to conceptualize and articulate ideas clearly and concisely
  • Demonstrable experience in developing, validating, publishing, and maintaining logical and physical data models
  • Excellent communication, presentation and interpersonal skills
  • Hands-on experience with modern enterprise data architectures and data toolsets (e.g., data warehouses, data marts, data lakes, 3NF and dimensional models, modeling tools, profiling tools)
  • Bachelor's or Master's degree in a STEM field
  • Strong background in algorithms, data structures, and coding, with programming experience in Java, Python, Go, or Scala
  • Exceptional proficiency in SQL
  • Experience building products using at least one technology from each of the following distributed-systems categories:
    • Relational stores (e.g., Postgres, MySQL, or Oracle)
    • Columnar or NoSQL stores (e.g., BigQuery, ClickHouse, or Redis)
    • Distributed processing engines (e.g., Apache Spark, Apache Flink, or Celery)
    • Distributed queues (e.g., Apache Kafka)
  • Experience with software engineering standard methodologies (e.g., unit testing, code reviews, design documents)
  • Experience working with GCP, Azure, AWS, or similar cloud platform technologies is a plus
  • Ability to drive change through persuasion and consensus