What Are Data Team Leaders Looking for During Hiring Interviews? – Built In Austin

Technical expertise is imperative when it comes to hiring data professionals, but soft skills may be just as important. For teams that are scaling quickly, it can sometimes be difficult to identify the traits that will help preserve the integrity of their established culture while rapidly hiring talent.

Russell Foltz-Smith, chief data officer at Smarter Sorting, sees curiosity as a top priority for the product intelligence platform. Working with data requires methodically sifting through clues and piecing together a giant puzzle, which he likens to a forensic exercise. That’s why the hunger to keep going in search of an answer is especially important to Foltz-Smith’s team.

Trey Perry, a senior software architect at insurance broker Acrisure Technology Group, agrees. “We’re searching for people who are curious, communicative and collaborative,” he said.

Striking the delicate balance between data wizardry and culture fit is difficult, so Built In Austin asked three local data leaders to share the secret recipe of qualifications they look for in the candidates joining their teams.

 

Layla Martin

Data Engineering Manager

Arrive Logistics is a freight brokerage.

When it comes to scaling your data team, what are the most important hiring considerations?

It’s no secret that there aren’t clear boundaries across data-related roles in our industry. It’s natural to find people working in the arguably broad disciplines of data engineering, data science, analytics and business intelligence across centralized or decentralized teams within an organization. More recent introductions to the industry are dedicated roles in data governance or security, machine learning and analytics engineering, and MLOps or DataOps. 

The most important initial step is having capable leadership within the company that can define a strategy for how these roles interoperate. Leadership should be well versed in industry trends, both in terms of historical and current disruptions. Their goal is to define strategy and ownership over different components of the data landscape and ensure the team evolves in the right direction. I’ve seen the most value come from personnel with deep knowledge of their discipline who can execute on high-value initiatives; those who can build relationships across disciplines and collectively move things forward; and those with an intellectual curiosity to always question, rethink and improve the existing landscape.

 

On the technical side, what steps have you taken to ensure your tools, systems, processes and workflows are set up to scale successfully alongside your team?

We place a lot of importance on isolating different components of the systems we’re building, such as storage and processing, orchestration, serving, notebooking, and visualization, to pick the best-in-class tooling or infrastructure for each piece. All-in-one solutions may look to solve a wide range of problems quickly, but often these solutions inhibit scale because they tightly couple everything together. Technology and vendors in the data space are changing rapidly and having the flexibility to introduce new, or reevaluate existing, components in our data stack allows us to grow alongside our organization’s needs and keep pace with changes in the industry. As we create new workflows or processes, scale is always a concern. 

Aside from typical planning around how our systems handle increases in data volume, we also think about how our design patterns enable us to move from solely data engineering-owned processes to democratizing access to data and infrastructure by providing self-service tooling for anyone working in the data space.
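The decoupling Martin describes, where each component of the stack sits behind its own seam so any one piece can be swapped for best-in-class tooling, can be sketched in miniature. The interface and class names below are invented for illustration, not Arrive Logistics’ actual systems:

```python
from typing import Protocol


class Storage(Protocol):
    """Minimal storage interface; processing code depends only on this."""

    def read(self, table: str) -> list[dict]: ...


class InMemoryStorage:
    """Stand-in backend for the sketch; a real one might wrap S3 or a warehouse."""

    def __init__(self, tables: dict[str, list[dict]]):
        self._tables = tables

    def read(self, table: str) -> list[dict]:
        return self._tables[table]


def average_weight(storage: Storage, table: str) -> float:
    """Processing logic sees only the interface, never the concrete backend,
    so the storage layer can be replaced without touching this function."""
    rows = storage.read(table)
    return sum(row["weight"] for row in rows) / len(rows)


storage = InMemoryStorage({"shipments": [{"weight": 10.0}, {"weight": 30.0}]})
print(average_weight(storage, "shipments"))  # 20.0
```

The same idea extends to orchestration, serving and visualization: as long as each boundary stays narrow, reevaluating one vendor does not mean rebuilding the whole stack.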

What’s the most important lesson you’ve learned as you’ve scaled your data team, and how do you continue to apply that lesson?

Frequent and honest communication across engineering, data science and analytics or business intelligence groups is imperative. Increasing our capabilities with data often means increasing contributions across all groups. It’s important to align early on what new skill sets, design patterns, or platform-level initiatives are required for future use cases because hiring the right people or building out robust infrastructure can take time. As our teams are heads-down building solutions to solve our current use cases, we’re also in constant communication about what’s next. We’re sharing articles about how others in the industry are solving similar problems, we’re participating in cross-team demos of new technologies on our radar, and we’re openly discussing pain points and improvements on current systems. Not only does this allow leadership to better plan for scaling teams as a whole, but it also empowers teammates to scale their own skill sets and contributions alongside the organization’s initiatives.

“Frequent and honest communication across engineering, data science and analytics or business intelligence groups is imperative.”

Russell Foltz-Smith

Chief Data Officer

Smarter Sorting is a product intelligence platform.

When it comes to scaling your data team, what are the most important hiring considerations?

One word: curiosity. Data science and data management are a constant forensic exercise. They require a deep curiosity to keep discovering and innovating.

“Data science and data management are a constant forensic exercise. They require a deep curiosity to keep discovering and innovating.”

 

On the technical side, what steps have you taken to ensure your tools, systems, processes and workflows are set up to scale successfully alongside your team?

We use the cloud from the beginning. It makes everything much easier in the long term. We also iron out open or inspectable development processes early on, and seek first to eliminate all single points of failure early and often.

We plan to rebuild data structures and platforms annually. You learn a lot by rebuilding, especially doing it from scratch.

Lastly, we hire people from a wide range of backgrounds and experience levels, and always hire in pairs.

 

What’s the most important lesson you’ve learned as you’ve scaled your data team, and how do you continue to apply that lesson?

As we grow, many opportunities, and many distractions, arise. While it is tempting to pursue all of them, we have learned to say no to projects, people and initiatives that distract us. Instead, we continually reinforce the need to stay laser-focused on developing our expertise and fulfilling the needs of the company.

 

 

Trey Perry

Senior Software Architect

Acrisure Technology Group is an insurance brokerage.

When it comes to scaling your data team, what are the most important hiring considerations?

It goes without saying that ATG maintains a high technical bar. With that said, our team culture is crucial! We want our team to be full of people who are excited about connecting many parts of our vast business and unlocking new doors for others.

Given that we work directly with analysts, engineers, artificial intelligence or machine learning researchers, and others, we look for a combination of analytical skills and emotional intelligence. Both are necessary to dive deep and fully understand the needs of our stakeholders.

“We want our team to be full of people who are excited about connecting many parts of our vast business and unlocking new doors for others.”

On the technical side, what steps have you taken to ensure your tools, systems, processes and workflows are set up to scale successfully alongside your team?

We strongly favor data democratization, and we actively look for opportunities to handle repetitive processes through automation.

This philosophy will become increasingly critical as we scale out and introduce more data connections. We must avoid introducing new human bottlenecks and the risk of slowly morphing into a more traditional, or pre-DevOps, operations team.

To that end, we balance foundational investments in our technology with delivering tangible business value. During sprint planning, we’re deliberate about reserving capacity such that we can research new methodologies and tools. When we believe that a tool holds promise, we write code and compare its performance to that of our existing infrastructure. For example, we have an ongoing effort to compare Apache Airflow, Argo Workflows and Kubeflow.
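The “write code and compare its performance” step Perry describes can be approximated with a small timing harness that runs each candidate on an identical workload. The harness below is a generic sketch, not ATG’s actual evaluation code, and the candidate names are placeholders standing in for runs of the same pipeline on tools like Airflow, Argo Workflows or Kubeflow:

```python
import time
from typing import Callable


def benchmark(candidates: dict[str, Callable[[], None]], runs: int = 3) -> dict[str, float]:
    """Run each candidate the same number of times on the same workload and
    record its best wall-clock time, which damps one-off scheduling jitter."""
    results: dict[str, float] = {}
    for name, fn in candidates.items():
        best = float("inf")
        for _ in range(runs):
            start = time.perf_counter()
            fn()
            best = min(best, time.perf_counter() - start)
        results[name] = best
    return results


# Toy stand-ins for real pipeline executions; swap in calls that trigger
# the same DAG on each orchestrator under evaluation.
timings = benchmark({
    "tool_a": lambda: sum(range(100_000)),
    "tool_b": lambda: sum(range(200_000)),
})
print(min(timings, key=timings.get))  # name of the faster candidate
```

In practice wall-clock time is only one axis; a real comparison would also weigh scheduling latency, operational overhead and fit with the existing Kubernetes footprint.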
 

What’s the most important lesson you’ve learned as you’ve scaled your data team, and how do you continue to apply that lesson?

Although communication is always important, we work in a rapidly evolving part of the industry during historic times. That has made it especially crucial to communicate effectively and make every effort to keep others informed.

Within the Data Architecture team, we try to broadcast the “why” behind our plans to make sure that members of our team understand the drivers behind our initiatives. Every member of our team has a voice in the conversation, with the ability to influence our direction.

We’re also passionate about stakeholder inclusion and have an open-door policy for our team meetings. When stakeholders can’t be present, we’ve also gotten better at writing down summaries and recording internal events, which helps us maintain cross-functional alignment.

Last, but certainly not least, our team strives to put people first. We understand that these are challenging times, and our continued support for each other is how we’ll get through them.
