TA Testing thoughts?

Due to recent events in the local job market, I’ve had to rush-open a bunch of positions a bit ahead of some changes I’m trying to make in our hiring process.

One of the things I’ve been thinking a lot about is how to standardize the process a bit more, as a way of managing unconscious biases or unintended barriers to new folks. In the past I’ve been super picky about hires, but as the department grows – and as it gets less homogeneous – I feel like I need at least a backbone of standardized tests so I can get a good baseline for comparison.

I’d love to hear people’s thoughts on testing in TA hiring, pro and con. Anecdata is totally appropriate :slight_smile:

Not a fan of tests, but at Virtuos we had a test split into 4 areas, which was mostly intended for (very) junior positions - realtime rendering, 3D production / DCC tools and some art questions, general IT / Windows knowledge, and a coding section.

Those covered the areas where we would employ TAs - work in an engine with shaders, lighting, etc.; work in a 3D application like Maya, Max, Designer, etc.; day-to-day troubleshooting with Windows, P4, plugin install, etc.; and finally, writing code.

The test was about 4 pages - you could answer it in 30-40 minutes if you knew your stuff. We didn’t really want to waste people’s time by giving them tasks that would keep them busy for days - personally, I feel long tests aren’t very professional, and as an employer you should take other people’s time as seriously as your own. Anyhow…

People straight out of college usually could answer it on their own. They could google if they wanted. But even then, some people just couldn’t get the right answers or explain how they came up with them :wink: And that’s okay - in the real world we also often rely on info from others, such as API docs, engine docs, the client’s Confluence, Stack Overflow or Google. But we must understand what the info we find means and how it applies to the problem.

The format was a mix of multiple-choice, fill-in-the-blank and open answers. We expected answers to reflect the experience from candidates’ interviews and CVs. Generally we administered the test before the interview. Sometimes afterwards, if we felt the candidate didn’t do well at the interview due to personal reasons such as a language barrier, anxiety, etc.

The test would tell us: whether the candidate knew the basics, whether they could reason, whether they could express themselves in writing, and whether they were more of a generalist or a specialist.

The idea was that getting all answers right wasn’t the point. Rather, the candidate should give solid, well-explained and well-reasoned answers to the questions in their area of expertise. We sorted out people whose claimed expertise didn’t match the test results, or who wouldn’t explain their reasoning. Obvious copy & paste jobs would also get tossed.

The test, CV and portfolio would then be the basis for the interview.

Another thing I recommend is getting a set of interview questions ready. Not necessarily a script, but a guide that makes sure you cover certain topic areas during the interview so you can compare candidates better. I usually try to tick the following:

  • past technical experience: technical challenges, how you overcame them, and what you learned. Type of tech worked with. Scope worked with (e.g. did you work on small scripts or large pipeline codebases?)
  • past work experience: does the setting compare to our setting? e.g. team sizes, relationship to superiors and other teams, process for managing work, tool dev, and day-to-day work.
  • interpersonal: how do you work with individual artists? Do you have a process when helping them? (Basically, we want a bit of a customer-focus mindset in TAs, because unless they are pure content creators, their main job is support and helping others!)
  • creativity, curiosity: if you could make your own game or tool, what would it be? Why this? How would you start planning or coding it? What tech would you choose? What risks do you think there are?
  • learning: how do you learn? Have you trained others? How would you go about learning something new?

At some point we did have a Unity- and Unreal-based take-home test, but I was never comfortable with it. It felt artificial and narrow in the topics it touched. We only gave it a few times, to senior candidates. The other drawback was keeping the test relevant, so we had to plan TA time to regularly review and update it.

I think the main problem with focused tests that only touch on a few topics is that they introduce another bias: people who don’t have the skills at the moment the test is administered, but who are perfectly able to acquire them in no time (after all, being a TA is about constant learning!), can be caught off guard and are at a disadvantage. Unless your test is so easy and general that everyone can master it - but then it would be pointless. Or you could say “well, if you need a week to learn stuff for this test, then do it” - but I hate wasting people’s time, especially if they have a good folio and resume.

We generally don’t do any kind of tests for TAs here.
The programming team does a fairly simple test during their interviews, and I had the fun of being the first and only TA subjected to it during my interview.

It was a nice simple self contained problem that took less than 30 minutes, and was really designed just to get a feel for how the taker thinks. I was encouraged to talk out my thoughts and process, and even given a few hints when I missed a potential edge case.
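The actual problem wasn’t shared, so here’s a hypothetical sketch of the kind of exercise that fits that description: small, self-contained, solvable in well under 30 minutes, with a couple of edge cases (empty input, touching ranges, unsorted input) for the candidate to talk through out loud.

```python
def merge_ranges(ranges):
    """Merge overlapping or touching (start, end) frame ranges.

    Hypothetical interview problem: small, self-contained, and full of
    edge cases to discuss (empty input, touching ranges, unsorted input).
    """
    merged = []
    for start, end in sorted(ranges):
        if merged and start <= merged[-1][1]:
            # Overlaps (or touches) the previous range: extend it.
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    return merged
```

The hints the interviewer gives ("what if the input is empty?", "what if two ranges only touch?") do double duty: they unstick the candidate and show how they react to feedback.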

I found the experience utterly nerve-wracking, but still approachable. However, I don’t know if it would have been as valuable if it hadn’t been done on-site as part of a conversation, if that makes sense.

I think the first place to start is understanding the role you’re testing for and designing tests around that. If I’ve got a position that’s going to be doing a lot of data processing and needs to be performant, I’d tailor a test toward that. If the role requires using an old, poorly documented codebase, I’d ship them something they’d have to really dig into the code for to figure out how to use it.
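For the data-processing flavor, a test task might look something like this (a hypothetical sketch, not a test anyone in the thread described): collapse near-duplicate vertices, where the interesting part is whether the candidate reaches for a hash lookup instead of comparing every pair.

```python
def dedupe_vertices(verts, tol=1e-6):
    """Collapse near-duplicate vertex positions.

    Hypothetical data-processing exercise. Rounding each position onto a
    tolerance grid makes the lookup O(1) per vertex, instead of the
    O(n^2) cost of comparing every pair of vertices.
    """
    seen = {}    # grid key -> index into `unique`
    unique = []  # deduplicated positions
    remap = []   # for each input vertex, its index in `unique`
    for v in verts:
        key = tuple(round(c / tol) for c in v)
        if key not in seen:
            seen[key] = len(unique)
            unique.append(v)
        remap.append(seen[key])
    return unique, remap
```

Whether they also spot the tolerance-grid caveat (two points just either side of a grid boundary won’t merge) tells you how deep they dig.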

I’ve seen a few resumes come across my desk recently with links to github accounts instead of, or in addition to portfolios. I like the idea, I think, because it gives me a sense of someone’s coding skills and thought process without necessarily the heavy overhead of The Test. The downside though is that for a lot of folks that can be more of a sketchpad, or a blog, instead of a proper portfolio piece, so it would be hard to sort the wheat from the chaff until we can standardize that kind of portfolio piece.

I think, though, what I want to start doing for TA/Python tests is throwing someone at the Advent of Code. They’re part word problem, part architecture challenge, with a bit of a need to write performant code, and don’t require any additional infrastructure from me that they’d need to work with. The problems are known, and as with so many things in Technical Art, I’m just really interested in how people solve them.
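To give a feel for the shape of those puzzles, an early AoC-style problem (paraphrased here, not an actual puzzle text) usually reduces to a few lines once the word problem is decoded:

```python
def count_increases(readings):
    """Count how many readings are larger than the one before them.

    Paraphrased Advent-of-Code-style exercise: most of the work is in
    decoding the word problem; the code itself stays tiny.
    """
    return sum(b > a for a, b in zip(readings, readings[1:]))
```

The later puzzles in each event are where the architecture and performance discussion happens; the early ones mostly check reading comprehension and basic fluency.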

This came in via PM from someone who asked to have it reposted:

Here are things I learned or was reaffirmed in certain practices I already use:

  1. Know the job role and its responsibilities. Have it clearly explained. Make sure you understand how the role is supposed to work within your team.
  2. Get rid of the word “fit” and define specific attributes that are the DNA of a TA. Also consider what attributes are needed to fulfill the role. By attributes I mean “analytic”, “motivated”, “determined”, etc. As you interview, you’ll need to ask questions that reveal those attributes in the candidate and determine what competency level they have.
  3. Don’t ask stupid questions that don’t have anything to do with the interview or are derived from some brain teaser. Also don’t bring someone in just to talk at them for the entire interview time.
  4. If interviewing a candidate with a group of 3 or 4 interviewers, each interviewer needs an objective of what type of attributes they’re trying to discover and at what level of competency. For instance, perhaps you want to know how adaptive a candidate is to new situations. One question may be “tell me about a skill that was self taught in order to overcome a production or workflow issue?” From here, you can dig deeper to figure out what motivates them as a TA. Other interviewer roles include: finding out if the candidate can fulfill the requirements of a full workday, when they can start, and dividing attribute questions upon interviewers.
  5. You can do tests, but we’ve only done them with junior candidates. Mid and senior level candidates should have demo reels, with code examples upon request. Interview questions should provide enough feedback for the rest of the info needed.

Here are a few of my thoughts that I haven’t seen yet. Above responses are great.

  1. Don’t waste people’s time. This means planning everything out ahead of time: know which areas/topics you want to cover and what questions you are going to ask. Don’t try to wing it on the day. Know exactly what role you need to fill and make it clear to them. Do phone screens before bringing them in, etc.

  2. Be consistent. Ask the same questions of multiple candidates so you can compare people fairly. Try not to bias their answers. Make a way to “grade” them on various topics. Take notes as you interview that you can look back on.

  3. Be respectful and appreciative. Thank the person for taking the time. Let them know when they can expect to hear back. If they aren’t hired, let them know what they can focus on before they apply again.

  4. This is a two-way conversation. Be sure to let them speak, stop them if they ramble, and allow them to ask questions as well. They also need to know your company is a good fit for them. If you can’t answer their question directly for various reasons, ensure it gets answered by the appropriate person before they leave.

Not a huge fan of tests. I have given one once or twice to rigging TAs. I look more for the ability to plan their creative process than for technically correct, efficient answers. It’s hard to make a question that demonstrates what’s useful to TAs, since our job changes so much and covers such a wide band of topics and social interactions.


Does anybody like whiteboard coding?

I don’t. But it’s a thing. Any defenders?

I don’t think I’ve ever seen a single person say nice things about whiteboard coding.
I’ve seen quite a few grudging “Well that’s just how some places are” style descriptions, but never anything positive about the experience.

If you’re going to have someone write code, give them a keyboard. Unless the job actually involves writing code on a whiteboard, why do it in the interview?

Though depending on the general state and scale of the project, it might be more interesting to see how well they read code, not just their ability to write it.
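A reading exercise can be tiny and still reveal a lot. For example, a snippet like this (hypothetical names) separates people who have actually been bitten by Python’s mutable default arguments from those who just pattern-match on syntax:

```python
def tag_asset(name, tags=[]):  # classic pitfall: the default list is shared
    tags.append(name)
    return tags

first = tag_asset("hero_rig")
second = tag_asset("env_kit")
# Both calls return the SAME list object, now ["hero_rig", "env_kit"]
```

Asking “what does `first` contain after both calls?” turns reading code into a conversation about how the language actually works.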

The thing with whiteboard coding is that it’s often a test of how candidates tackle coming up with an algorithm. If you plan to have TAs write small, specialized code, then this might be a good test. For example, shader effects, methods performing specific actions, or processing of 3D data such as polygons, vertices, bone chains, etc.
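As a concrete (hypothetical) example of a whiteboard-sized 3D-data problem: walk a bone chain and total its length. It’s small enough to sketch by hand, yet still invites discussion of edge cases and data layout.

```python
import math

def chain_length(joints):
    """Total length of a bone chain, given joint positions as (x, y, z) tuples.

    Hypothetical whiteboard-sized problem: trivial to state, but it opens
    a discussion of edge cases (fewer than two joints) and data layout.
    """
    # math.dist (Python 3.8+) is the Euclidean distance between two points.
    return sum(math.dist(a, b) for a, b in zip(joints, joints[1:]))
```

The follow-up questions ("what if the chain is branched?", "what if positions are local, not world-space?") are where the real signal is.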

Generally you want to see how candidates plan ahead, what their thinking is, and whether they can explain the advantages and drawbacks of their solution. E.g. runtime performance, scalability, predictability.

This can be an okay test for programmers who will work on a specialized subset of the larger code base. It’s a bad test if you want to compare tech artists across the board. Some tech artists may not even know how to code. (I actually had a great guy on my team who knew a lot about real-time rendering, Blueprints and UE4’s materials, but he couldn’t really code at all!)

The thing is, with tech art, runtime performance and scalability aren’t really that important in many cases. We’re not optimizing the majority of tools like we optimize a 60fps game. Nor do our pipelines need to scale up like Google’s services. And half of the code may even be considered throw-away.

Here is what I consider important:

  • being able to work with other peoples’ code - R.White’s suggestion about reading code is a great one!
  • being able to plan ahead - thinking of how you will structure your program, how you separate concerns, how you go about making your code maintainable and expandable
  • considering usability - what separates TAs from programmers is that they should understand how artists work and what they value in “artist friendly tools”. If a TA can’t think about this, why hire them at all? I could just borrow a random coder from the programming team instead. In practice we often mock up tools’ UIs - sometimes it’s just sketches on paper. A candidate could do this on a whiteboard.
  • ability to leverage the standard library, 3rd party frameworks and systems to get the job done. E.g. I might be more forgiving of not choosing the most efficient data structure than of not using os.path to work with paths, or of ignoring other built-ins whose use can add a lot to readability and maintainability.
  • thinking about error & exception handling and formatting. Good tools and pipelines are resilient and handle errors gracefully.
  • understanding relationships between data. This could also be a good whiteboard exercise. Especially for tools that operate on larger data sets (files, directories - or a diverse range of objects in a scene) it can be helpful to analyze their relationship, understand the attributes you need for processing, and thinking how those can be managed in your code. This could be captured in a diagram (something like a simple ERD) and then based on this, data-structures can be chosen.
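To make the os.path and error-handling points concrete, here’s a minimal sketch (all names hypothetical) of the kind of tool code I’d hope to see: it leans on os.path instead of string surgery, and hands back a friendly message rather than throwing a raw traceback at the artist.

```python
import os

def load_preset(path):
    """Read a tool preset file, returning (text, error_message).

    Hypothetical example: the tool degrades gracefully and gives the
    caller a human-readable message instead of an unhandled exception.
    """
    if not os.path.isfile(path):
        return None, f"Preset not found: {path}"
    try:
        with open(path) as f:
            return f.read(), None
    except OSError as exc:
        return None, f"Could not read {path}: {exc}"
```

A caller (or the tool’s UI) can then show the message to the artist and carry on, which is exactly the resilience good pipelines need.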