---
title: How is Generative AI changing the experience of learning to code?
subtitle: "*Perspectives of undergraduate physics students*"
date: 2025-10-01
date-format: long
format:
  revealjs:
    logo: assets/UoE_SoPA_logo.png
    slide-number: c/t
    show-slide-number: all
    progress: true
    embed-resources: true
    footer: "[https://explrncode-2024.github.io/](https://explrncode-2024.github.io/)"
authors:
  - name: Dr Joe Marsh Rossney, FHEA
    affiliations:
      - "UK Centre for Ecology & Hydrology *(current)*"
      - "School of Physics & Astronomy, University of Edinburgh *(outgoing)*"
    orcid: 0000-0002-8082-8566
bibliography: references.bib
include-in-header:
  - text: |
      <style>
        #title-slide .title {
          font-size: 1.5em;
        }
      </style>
---
##
:::{.fragment .fade-in}
> I think it's making it more enjoyable. It's getting at the part of programming that I enjoy...
:::
:::{.fragment .fade-in}
> when they were explaining to us about the guidelines for using AI... I kind of was like, OK, I'm just not going to use it.
:::
:::{.fragment .fade-in}
> I don't care if this makes me a boomer, I don't think that using AI to learn how to code is a good idea.
:::
:::{.fragment .fade-in}
> Why do I need to know to do this? Because in the future when the AIs get better and better, maybe I'll never actually have to code.
:::
# Introduction
## My CV in brief
- Master's in Physics (Univ. Warwick, 2018)
- PhD in Theoretical Physics (Univ. Edinburgh, 2024)
- [PgCAP](https://institute-academic-development.ed.ac.uk/learning-teaching/cpd/postgraduate-certificate) (Univ. Edinburgh, 2024)
- [Research Software Engineer](https://society-rse.org/about/) (UKCEH, 2024-present)
:::{.fragment}
#### Education credentials?
I'm a passionate amateur who has taken available opportunities to develop my practice 'on the side'.
:::
:::{.notes}
- PhD research on generative ML
- Techniques similar to diffusion models (less so LLMs)
- Education experience & credentials?
- I am not an experienced or highly credentialled education researcher
- More of a passionate amateur who has taken available opportunities to develop my practice "on the side"
- I've "minored" in teaching since I left school
- Interest in teaching started while at school
- 2013-18: ad hoc tutoring, teaching support, cover teaching
- Inspired to become a teacher by Ken Robinson's TED talk "Do schools kill creativity?" and book
- 2019-24: Lots of TAing for SoPA and EFI
- Completed IntroAP & PgCAP alongside research
- Current job: educational element is still present ("Training" is one of the "four pillars" of RSE) https://arxiv.org/pdf/2002.01035
- PhD supervision opportunities (in future)
- This summer, working with UG intern to develop learning resource: "toybox" of environmental models
- Training African academics in use of land-surface model
:::
## Why am I here?
Last year I led a research project funded by [PTAS](https://institute-academic-development.ed.ac.uk/learning-teaching/funding/funding):
**How is generative AI changing the experience of learning to code?**
:::{.fragment .fade-in}
- Started as coursework for PgCAP _"Researching your Teaching"_
- Focus on UG **physics** students
- Collaboration with two UG students
- Interviews took place last July & September
- Analysis & write-up is... still ongoing
:::
:::{.notes}
Early 2024, coursework for [PgCAP](https://institute-academic-development.ed.ac.uk/learning-teaching/cpd/postgraduate-certificate) _"Researching your Teaching"_ module:
> Write a draft [PTAS](https://institute-academic-development.ed.ac.uk/learning-teaching/funding/funding) application for a piece of research.
- Quite burned out after PhD
- Sandwiched between PhD + job - secretly, running alongside both, because there was only a 1 month gap in which I was working on PgCAP...
- First semester I'd not TA'd since starting my PhD, and missed it.
- I was sick & tired of my PhD and wanted to do something _completely_ different and people-focused for a change.
:::
# Background
## Programming courses in SoPA
- Essentially Python-based
- Main foci: data analysis, modelling & simulation
- In-person workshops, strong TA presence
- Assessment through 'checkpoint' tasks and/or projects
:::{.fragment .fade-in}
Special mention: _Computer Simulation_, developed over **~15 years** by Judy Hardy (Professor of Physics Education).
**A lot of care has gone into these courses.**
:::
:::{.notes}
- Include visualisation in both DA + M&S
- I have TA'd on several of them, but came back multiple times to Comp Sim
- Programming courses were working really well.
- Judy Hardy (Physics Education researcher) developed 'Scientific Computing' and 'Computer Simulation' over a period of ~15 years
- A lot of work and care has gone into the design and implementation of these courses.
- And they were popular! Anecdotally, students enjoyed the courses a lot.
- Workbooks (interactive but printable) contained *a lot* of examples.
- Used to mark the coursework (Solar system) for every student herself - well over 100 projects
- This was a very open-ended project, and some students went very far with it!
- Judy Hardy unfortunately (for me) retired the year before
- I was (and still am) worried about the loss of that accumulated knowledge if GenAI prompts a radical shake-up.
:::
## Course Organisers
Late 2023, I asked course organisers:
> ...tell me about any policy, guidance or messaging aimed at undergraduates about the use of these tools.
:::{.fragment .fade-in-then-out .absolute bottom=50}
(1/4)
> We specifically say "AI tools assistance not allowed" and say that in the same context as plagiarism policy, so consequences are the same as using someone else's work.
:::
:::{.fragment .fade-in-then-out .absolute bottom=50}
(2/4)
> My goal for next year is to completely re-work how the course is taught, and instead allow and explicitly encourage them to use AI throughout.
:::
:::{.fragment .fade-in-then-out .absolute bottom=50}
(3/4)
> I don't know that I will encourage them to use AI, but I will make it clear that they need to use it to help them understand what to do and not simply do it for them.
:::
:::{.fragment .fade-in-then-out .absolute bottom=50}
(4/4)
> What would be interesting is if a chatbot could replace the demonstrators in asking the students various questions about their code.
:::
:::{.notes}
- Asked this while I was developing this idea for PgCAP, without necessarily intending to apply for funding to do the project itself.
- Surprised by the highly varied responses.
- High variance understandable, not a bad thing necessarily since early days, lack of "evidence" https://arxiv.org/pdf/2502.09618
- But students must be so confused!
- Clearly there are value judgements being made here
- But clearly falling back on priors, personal lens
- To move forward and become more coherent, we need to bring in student perspectives
- This is what pushed me to consider actually finding a way to do this research.
- To SoPA: students are confused and you need to get a grip. Let's learn what we can from students.
:::
## Existing research
Quick scan of the literature in January 2024 did not throw up anything that 'seriously' focused on student perspectives towards AI technologies in programming.
Really nothing that ticked all the boxes:
- [ ] Focus on student experiences and perspectives with GenAI coding tools
- [ ] In a physics context (or at least not computer science)
:::{.aside}
See backup slides!
:::
:::{.notes}
- Could not find high-quality research incorporating student perspectives in the literature. The occasional survey.
- Sun: Superficially very similar, but I suspect the "students as test subjects" lens led to a restricted form of analysis, which is evident by the three "themes" identified (I don't think these go into enough depth at all).
- Sheard: Fairly close to what I wanted to do, but focused on instructors
- Mahon:
- Boguslawski: This is the most similar in spirit - the study is exploratory, carefully chosen sample of students and faculty to inform a design process - not an A/B study or something intended to be published.
- However, only 12 students, taking an applied science course, not Physics, interviews conducted in _early_ 2023, and did not cover everything I wanted to cover.
- Something that struck me was the dearth of work on computing/programming in physics education, or stem more broadly.
- Will come back to this because this niggled away at me and distracted me a lot from the work at hand!
:::
## Why do I care about this?
:::{.incremental}
- My own experience with UG programming courses was highly formative: closest I came to experiencing _creative freedom_!
- _For whom_ is GenAI a facilitator of learning, and _for whom_ does it create a diversion from a more valuable learning experience?
- This transitory period is very interesting from a socio-historical perspective.
- We know little about the drivers of attitudes and behaviours towards GenAI - are the students even listening to us?
:::
# Project Overview
## Team
- Me
- Sarah Hogarth (BSc Physics, 2024)
- Naomi Garcia Elizondo (MSc Physics, 2025)
- Ross Galloway (Professor of Physics Education)
- Britton Smith (Chancellor's Fellow, Astrophysics & CO for _Computer Simulation_).
With some valuable support from
- Kristel Torokoff (Senior Lecturer, Physics Education)
## Key aims
:::{.incremental}
1. **Gather evidence to inform educators**: Identify elements of pre-AI courses worth preserving, risks and opportunities.
2. Incorporate a **broad range of perspectives**: **who** feels **what way** about these developments and **why**?
3. **Observe trends** in the perspectives, behaviours, learning orientations of students who (initially) learned to code before/after (with/without) GenAI.
4. **Establish a baseline** against which future studies may be contrasted.
:::
:::{.notes}
- To SoPA: students are confused and you need to get a grip. Let's learn what we can from students.
- The essential aim is to produce primary evidence from students' perspectives that can be used by Course Organisers who are intending to update their programming courses to incorporate limited use of GenAI in a manner that is purposeful, responsible and critical.
- The premise of our study is that the experiences and perspectives of the current cohort of undergraduate students are (perhaps uniquely) rich in information that can help educators to navigate a path towards the explicit inclusion of GenAI in the teaching of introductory programming
- Trends:
- Differences between age/year groups?
- Or differences between students who learned traditionally vs didn't?
- Baseline:
- Never going to get another opportunity to talk to students who first learned traditionally, and had GenAI drop while they were taking university courses. I.e. this is a one-time opportunity to attempt a "benchmark".
- These students are interesting (to me) in their own right
:::
## Interviews
| | Round 1 | Round 2 |
| --- | --- | --- |
| When? | July | September (w1-2) |
| Survey advertisement | Discord | Email/posters/in-person |
| Num. responses | 9 | 160 |
| Num. interviews | 6 | 18 |
| Interviewers | Joe | Joe/Sarah/Naomi |
:::{.notes}
- Jisc survey
- Students promised £20 gift voucher to compensate for 45 minute interview
- Round 1
- 9 3rd/4th years responded to Jisc sign-up survey shared (by Naomi) on student Discord server
- Sign-up survey included questions on preferred pronouns, degree programme, year group, courses taken, use of GenAI
- Fairly good mix of responses across these questions, but strong bias towards competent, interested students with career ambitions
- **6/9** responded to email asking to arrange interview
- Round 2
:::
## Interview structure
- **Semi-structured**, with open-ended questions and space left for follow-up
- Interview guide loosely organised into 3 parts
- Tell me about a coding project that was **fun**
- Tell me about a coding project that was **important** (for your learning)
- Tell me about a coding project where you **used GenAI** (if applicable)
- Also discussed: group work, motivation, future plans, creativity, incorporating GenAI into courses
:::{.notes}
**Why semi-structured interviews?**
- Conceptualised this project as _exploratory_ research
- Assume we don't know a priori the right questions to ask
- Aim for a wide cross-section of students, not a "representative" sample
:::
## Analysis
:::{.aside}
See backup slides!
:::
# Results
- Reminder: interviews took place in **July/September 2024**
- Caveat: analysis is **not yet finished**
:::{.aside}
More results in backup slides!
:::
## Immediate observations
- Adoption appears high among novice programmers, but mostly confined to the free version of Chat-GPT (3.5).
- Adoption more difficult to predict among more experienced programmers: split into using extensively or not at all.
- Identifying archetypes was very challenging: relationship between learning orientation and GenAI use was complicated.
- Noticed differences between year groups, but small sample + confounding factors limit conclusions.
:::{.notes}
- Almost everyone was using Chat-GPT free version. Very little copilot. No Cursor.
- Students didn't divide neatly along the lines you might expect.
- E.g. early adopter, but also likes to do things with his hands.
:::
## What are students using GenAI for? {.scrollable}
- Many students report using GenAI as a syntax reference - far more than for code generation.
- Many of these were experienced programmers, who just liked not having to remember all syntax all of the time.
- A couple (again, more experienced) had used GenAI to quickly learn new libraries or languages.
#### Student T
5y, experienced programmer, even before starting UG. Little GenAI use
> So I don't use it very much for programming...
except for trivial lookups...
> I do use it for... like one line things so my most frequent one is probably like how to make a 2D NumPy array that's this many by this many because I just can't remember like all the brackets and things like that. So I'd say, yeah, I use it quite a lot for silly little lines that I could look up if I wanted to, but... it's much easier to get that from AI.
> Whereas I wouldn't use it for like big chunks of code. I think I've tried it a couple of times for that and ... it's just not always right and it's, yeah, it confuses me more than just trying to figure it out myself.
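As an aside, the kind of one-liner Student T describes really is this small; a minimal sketch (the shape and the choice of `np.zeros` as the fill are illustrative, not taken from the interview):

```python
import numpy as np

# The sort of "silly little line" students describe asking GenAI for:
# a 2D NumPy array that's "this many by this many".
grid = np.zeros((3, 4))  # 3 rows, 4 columns, filled with zeros
print(grid.shape)        # (3, 4)
```

The trivia being delegated here is the bracket convention (shape passed as a single tuple), which is exactly the detail Student T says they can never remember.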
#### Students are routinely going straight to GenAI when standard, non-AI tools could serve them just as well
This makes sense, since they are not taught how to use any editor tools, which have a somewhat high barrier to entry.
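For illustration, the same lookup is available offline through Python's built-in introspection, which courses could teach directly; a minimal sketch (NumPy is used purely as an example library):

```python
import numpy as np

# Standard, non-AI ways to answer "how do I call this function?":
help(np.zeros)  # prints the full docstring in a terminal or notebook

# Or inspect the docstring directly:
doc = np.zeros.__doc__
print(doc.splitlines()[0])  # first line shows the call signature
```

In Jupyter/IPython, the same information is one keystroke away (`np.zeros?`), which lowers the barrier further.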
However, some students articulate how GenAI is not just more convenient than search, but can be more helpful:
> when I ask it a question, I'm able to get almost sort of contextualised documentation. And that's something I've used it for if I'm learning a new like library and I kind of want to get started quickly, but I hate having to like look through documentation, so I'll just kind of prompt it a bit and I can get started like that.
#### Student H
3y, novice programmer, extensive GenAI user in many areas of life
> oh I don't know what to do, like, what should I do? I'm going to ask the AI, is something that I'm used to. That thought process is a kind of normal for me now.
They want to enjoy programming, but find it hasn't yet "clicked" for them.
> I feel like if I was good I probably would enjoy it, but it's just always something that I feel like I'm... I always feel somewhat behind in it. I feel like I should be able to do more than I can quite often.
They find it stressful to be confronted by problems they cannot solve.
To make matters worse, their experiences asking for help from peers and lecturers have not been good.
> They were just quite, it's written there, why are you asking that question? It's like, oh, no, OK. And then once you have a couple answers like that, you end up just like not really asking anymore.
So Chat-GPT really helps them:
> but then, I can ask that question to Chat-GPT... and it can rephrase and it will pick out the just the exact piece that I need.
## Perceived usefulness, motivation to use GenAI {.scrollable}
Most students saw GenAI as **error-prone** on complicated tasks but useful for documentation, debugging and ideation.
Some students have spent time experimenting with how to get the results they want from GenAI, or testing different GenAI tools.
> if I have a problem and I'm wondering if it can do it, I'll spend quite a lot of time rephrasing and seeing what it actually can do, you know, breaking it apart, see if it can do certain bits, stuff like that, just to see the ability that it has at the minute.
Me: _did you ever ask it to just generate some code?_
> I did that probably every time, but it was more to see what it would do... it was still good I think to see how it went about the problems, just to cross check as well for future reference too.
Others report trying, finding it frustrating or not helpful, and quickly giving up on it.
> when it works it's great, but I think most of the time I'm just kind of frustrated and so I move on very quickly from it.
Not knowing how to use it:
> I think right now I struggle to find ways to use it in learning to code
Frustrating experiences have a knock-on effect:
> by like the third change it's already breaking constantly and I need to constantly like look through it and figure out why it's breaking ...
> and it's just like, I could probably have gotten like properly good at using matplotlib pyplot... if it wasn't for the stupid decision to save a bit of time by using ChatGPT
Or just finding low motivation to try:
> I think I'm a bit slow to like, pick up these kind of things, like I'm always a bit, I don't really want to.
- Why are there these differences?
- How can we use these insights to shape practice? Do we want to encourage more tinkering and experimenting?
:::{.notes}
- "if I had a problem..." lower-year student who does not enjoy programming but feels like they should
- "when it works..." Student B (y5), a very experienced programmer who had only tried free Chat-GPT,
- "I think..." Student Q (y4) doesn't have a strong interest in coding, but finds it useful for visualising solutions etc.
:::
## Who is not using GenAI? {.scrollable}
#### Student F
- felt like they were worse than their peers in 1st year
- But very motivated & strategic, wanted to level up their skills
- Now has a job in software, working at a low level
- Felt intimidated by peers, worked alone: might think GenAI appealing!
- But no interest in GenAI!
> So for example, I stick with like my old phone for like years until it's literally like, barely working. And, you know, I see like new features and I'm like, oh, this is like, unnecessary and stuff.
I asked if they might use it in future?
> But, I feel like it's not at the stage where, you know, it can replace human creativity and everything, so, I don't think I would use it for like, actually coming up with new solutions and things like that.
So: a mixture of individual traits and a low opinion of its abilities.
#### Student V
> And then like for code, I just felt like it was cheating. Erm, (visually contemplating) which I think it... isn't. I also don't really know what to ask it to give me the right things, so I think I'm just not, I'm just not that comfortable with it and so I've never really tried.
> I'm not like completely closed off to it. I just, like I've read that it's quite bad for the environment, is one thing. Like I've seen some stuff about how like chat-GPT uses a lot more energy than just like a plain Google search would and, so I'm like if I can just Google it, why not.
> I've never been like, oh my goodness, like, I just really wish that this were done, like, instantaneously and like I could just, like, get this like robot slave to do it for me.
Ultimately, this student does not feel good about using AI.
Is this being taken into account on the balance sheet? Is it actually worth insisting that a _physics_ student must learn to use GenAI in order to complete a programming course in their physics degree?
#### Hesitations
Student E is a highly experienced programmer
> I think, for a bit of my time I became quite reliant on it. And then I've stopped more recently.
Me: _"is that because you've moved on to CoPilot, or any other reason?"_
> I don't really know why actually. Maybe I've just got more confident in my abilities
Student G is a novice programmer who _both_ feels that GenAI has made them a better programmer, but intentionally stopped using it after realising they were reliant:
> I was using AI for like homework and stuff like that... I shouldn't have done and it made it harder for me to learn how to code.
> But when I was actually like learning how to code outside of class, I wasn't using AI.
> I was using like human resources I had available to me which was helpful. It was better.
Student U (1st year!) is a very strong programmer who frequently takes on advanced projects independently, and has struggled to find peers to work with at their level.
They used GenAI in the past, but...
> I think my thoughts around them \[AI tools\] are you know a little bit tainted by ... well the air that surrounds them, that is, you know, it's like crypto but stupider somehow.
:::{.notes}
- Student F
- I was surprised and challenged by this student
- Expected them to use GenAI either to get ahead of competition, or because they prefer working alone
- If they are an "ownership important" person, expected them to talk about meaningful projects that they did from scratch, but instead talked about projects where they had to edit and extend existing code
- Best guess: a key reason for non-adoption is rooted in self-confidence, built by having strived and succeeded without any 'short cuts'
:::
## Should it be incorporated into courses? {.scrollable}
Almost all of the students reflexively reacted "no", or immediately expressed concern.
Some reflected and came round to "maybe, only if done carefully and not from the beginning".
Student A:
> I think the main thing you would have to be careful about is that you're not getting to generative AI to completely solve the problem, because for earlier programming courses where it's very important to develop an intuition, you can just use it to solve the problem without getting any sort of, again, problem solving development yourself.
Student E:
> But for people that maybe had done less in the past, they wouldn't learn anything... Like yeah, AI is great but you still need to know how to code, like fundamentally.
Student N:
> even though there is value in talking through your code, there is a big difference between asking ... and doing it yourself.
> Relying entirely on it is a mistake. Having a course that says, sure, use generative AI, is a mistake.
Student V:
> you shouldn't really be given like integral-calculator.com to learn how to integrate.
but immediately went on to say,
> if it's just become a part of how you program in the modern world, then it should be taught, but not as like a all powerful 'this is gonna solve it for you'
Student I has a different view:
> I think it'd be nice to have at least maybe one or two sessions to go over and be like, OK, this is what you can use Chat-GPT for, and to sort of like not feel like you'd be punished for using it.
Clearly, banning it isn't a solution:
> one of the problems actually I found with the course last year was it was very 'don't use AI', but you could just ask the AI and it would give you the complete run of code.
:::{.notes}
- Note: bias towards one's own experience being best.
- But we shouldn't ignore students who say "yes, it's useful to me now, but I am glad it wasn't around when I was learning"
:::
## Summary (1/2)
:::{.incremental}
- Students are emphatically not uncritical adopters
- Their attitudes towards GenAI are complex and varied
- Views are volatile, self-questioning
- They are seeking reassurance, guidance and engagement from instructors
- They really want to be included in the conversation, but many of them feel as though it is a confrontation at the moment
- It would be **very hard to extract this level of nuance from a survey**
:::
:::{.notes}
- Grand conclusions:
- People have nuanced, sometimes contradictory views, and complicated reasons for doing things.
- Students are people.
- Who'd have thought? Students have nuanced views!
- They are grappling with this issue themselves
- Students are (as of September 2024) confused. This is unsurprising given that COs are confused.
- We need to work on our messaging, and make sure students understand what they are expected to get out of their education, where its value is
- Reassure them that the skills they are learning are still valuable, and lecturers know what they're doing
- (This requires lecturers to know what they're doing)
- I appreciate that not everyone finds this type of research, into individual, messy people, interesting.
- But I do, and I love how surprising this was to me.
- Of course, it's of limited use to say "treat every student as an individual".
- The aim is still to tease out some insights that can inform practice more broadly.
Anecdotally, many educators assume that students are primarily motivated by getting a good grade, preferably with minimal effort (or, at least, that there are enough of these students that the course needs to be designed to make sure they are "forced" to learn).
Outcome vs process oriented.
However, we found many students who had motivations to learn to code beyond their grade, and who felt that the course structure sometimes got in the way of learning in the way they found most useful.
:::
## Summary (2/2)
:::{.incremental}
- Generally, students thought the capabilities of GenAI were very limited, but often this was not backed up by their own experience or research (same for me!)
- Students are routinely going straight to GenAI when standard, non-AI tools could serve them just as well (but they don't know about them)
- Undoubtedly, helping students come to an understanding of what GenAI is actually capable (and not capable) of is an important requirement.
:::
:::{.notes}
- So here's hopefully a few ideas that can more directly inform course design
- Most students didn't know how to get good results out of GenAI, how to use coding tools (other than Chat-GPT)
- Programming experience does not predict willingness to invest in GenAI tools
- Bear in mind, many young people have digital fatigue, or dislike AI for other reasons - not surprising that student adoption is not skewed towards early adopters, but looks more balanced, with most fairly on the fence, and some laggards
:::
# Closing
## What's left for this project?
- Conclude analysis
- Present key results on [project website](https://explrncode-2024.github.io/)
- Publish preprint & submit to journal
- Teaching matters blog
## What's next for me?
- Part of my new role involves training research staff, including postgraduate students: **I am still thinking about this!**
- I ran a [workshop](https://virtual.oxfordabstracts.com/event/75166/submission/107) on AI coding tools at RSE Conference.
- Around 90 attendees! Lots of notes to synthesise...
- Highly mixed picture remains: experience level does not determine use of AI.
*I want to show it's possible to enhance learning and research without over-reliance on big tech and resource-intensive cloud computing.*
:::{.notes}
- In many ways research scientists different from UG students, of course
- But in some ways they are similar
- Many did not learn to code well, or at all, but increasingly need to know how to read and write code
- Lots turning to GenAI, e.g. trying to move from GIS to R/Python
- Key observation for me: many programmers are attempting to leapfrog the 'painful' stage of learning to code using GenAI
- We can go a long way by teaching tools, techniques and principles properly, without AI
- E.g. show people how to use editor features, LSPs, debuggers, navigate documentation, write tests, version control their code etc.
- I am particularly worried about where GenAI is on the enshittification trajectory, and how aggressively tech companies are trying to embed themselves in organisations, including "cash cow" governments and education
- Let's make sure we have an off-ramp!
:::
# Thanks for listening!
**Questions?**
<!-- ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -->
# Backup slides
## Priors & theoretical influences (1/2)
- Student-centred learning [@Dewey1916],[@Rogers1969],[@Entwistle1979]
- Structure vs individual agency
- What structural factors lead to 'wrong' use of GenAI?
- Social & cultural context
- Group working, students as teachers
- Student-teacher relationships (_"feedback without trust is just noise"_)
- Attitudes towards GenAI
:::{.notes}
- Aim to be exploratory, not overly theory-driven.
- Still came with some priors based on my own experience & some background reading.
- Actually got very lost in the weeds of background reading, as did Sarah!
- Student-centred learning
- Dewey - constructivist school, role of teacher as a _guide_, rather than _source_ of learning
- Rogers - learning is only possible when the learner believes the learning will be personally rewarding and meaningful, emotions important
- Entwistle - Learning orientations: _meaning_, _reproducing_ & _strategic_ orientations
- Students can evolve from reproducing/strategic to meaning
- Structure/agency
- Tension between student-centred model, giving students responsibility over their learning, and rejection of "student deficit model"
- There is a balance, what is it?
- Phones in classrooms
- We accept that this can be detrimental to learning because they (and the apps on them) are (designed to be) addictive and because there is social pressure to use them.
- We tend not to accept that any student texting during class is fundamentally deficient and that the resolution is for that student to practice being more disciplined.
- The problem with GenAI in education is that it's a huge unknown; nobody understands how it is changing the way we learn, think and work.
- We are all guinea-pigs
- Social context - group learning, peer-peer interactions, prestige, motivation to help others etc
- Cultural context - shared values & beliefs
- Are there things here worth hanging on to that could be threatened by hyper-individualised AI-based learning?
- Attitudes - does this produce division between students who are pro and students who are against?
:::
## Priors & theoretical influences (2/2)
- Student **ownership** in (project-based) learning [@Wiley2009],[@Hanauer2014]
- **Motivation** to learn [@White1971],[@Bandura1977],[@Harter1978]
- Process, & productive (/unproductive/destructive) **struggle** [@English2016],[@Murdoch2020]
:::{.notes}
- Ownership of PBL - me seeking to understand why programming courses felt so good for me
- Note: Project ownership survey did not seem to work too well
- Motivation, especially "effectance motivation", the role of peers & wide social context
- Fell down a bit of a rabbit hole of video game research - what makes games fun? Much overlap with programming.
- Self-efficacy: _"the conviction that one can successfully execute the behaviour required to produce the \[expected\] outcomes"_ (Bandura)
- Previous experiences of mastery are important for self-efficacy
- Productive struggle
- English - the "in-between" of learning
- Prior: prompting GenAI can feel like playing a slot machine (like Dewey's "trial and error experiences")
- extremely low friction (OpenAI's great "success"), and often the regime is one of low probability of a highly successful outcome (AI solves the whole problem for you).
- Ideally, use GenAI to facilitate production of _"reflective experiences"_, but how?
- I like to think of either iterative or recursive experiences
- Iterative: shallow cycling through same strategy
- Recursive: increase in depth at each step
- But, realised I am not qualified to assess psychological goings on, though I can enquire into habits and strategies
:::
## Existing research
Quick scan of the literature in January 2024 did not throw up anything that 'seriously' focused on student perspectives towards AI technologies in programming.
:::{.fragment .fade-in-then-out .absolute top=300}
#### @Sun2024 (Feb)
- Semi-structured interviews with $n=43$ Educational Technology majors (China), following an RCT introducing a custom GPT into a Python assignment
- Three 'themes' identified: (i) _"services offered by ChatGPT"_, (ii) _"stages of ChatGPT usage"_, (iii) _"experience with ChatGPT"_
:::
:::{.fragment .fade-in-then-out .absolute top=300}
#### @Sheard2024 (March)
- Semi-structured interviews with $n=12$ CS university instructors (Aus/NZ/Finland)
- _"our goal is to prompt dialogue between researchers and educators"_
:::
:::{.fragment .fade-in-then-out .absolute top=300}
#### @Mahon2024 (July)
> The goal of this paper is to provide an up-to-date picture of the emerging role of GenAI in CS1 courses and offer a detailed, research based roadmap for educators contemplating its integration into their teaching practice.
:::
:::{.fragment .fade-in-then-out .absolute top=300}
#### @Boguslawski2024 (July)
- Semi-structured interviews with $n=12$ students & $n=6$ instructors, applied sciences course (Germany)
- Focus on the relationship between motivation and LLM use
- Original purpose: inform design of an online learning platform
:::
:::{.fragment .fade-in-then-out .absolute top=300}
Really nothing that ticked all the boxes:
- [ ] Focus on student experiences and perspectives with GenAI coding tools
- [ ] In a physics context (or at least not computer science)
:::
:::{.notes}
- Could not find high-quality research incorporating student perspectives in the literature - only the occasional survey.
- Sun: Superficially very similar, but I suspect the "students as test subjects" lens led to a restricted form of analysis, which is evident in the three "themes" identified (I don't think these go into enough depth at all).
- Sheard: Fairly close to what I wanted to do, but focused on instructors
- Mahon:
- Boguslawski: This is the most similar in spirit - the study is exploratory, carefully chosen sample of students and faculty to inform a design process - not an A/B study or something intended to be published.
- However, only 12 students, taking an applied science course, not Physics, interviews conducted in _early_ 2023, and did not cover everything I wanted to cover.
- Something that struck me was the dearth of work on computing/programming in physics education, or STEM more broadly.
- Will come back to this because this niggled away at me and distracted me a lot from the work at hand!
:::
## Transcription
- All interviews conducted on MS Teams
- Scheduled for 45 minutes (actual lengths varied from roughly 30 to 75 minutes)
- Edit Teams' automatic transcriptions in-place to correct errors and redact identifiable info
- Download and format to make human-friendly using a Python script^[https://github.com/ExpLrnCode-2024/teams-transcript-formatter]
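The linked formatter handles the real Teams export; as a minimal sketch of the idea, assuming Teams' WebVTT output with `<v Speaker>utterance</v>` cues (a simplification of the actual file format), consecutive cues from the same speaker can be merged into readable paragraphs:

```python
import re

# Matches one WebVTT voice cue: <v Speaker Name>utterance</v>
CUE = re.compile(r"<v ([^>]+)>(.*?)</v>", re.DOTALL)

def format_transcript(vtt_text: str) -> str:
    """Collapse consecutive cues from one speaker into a single block."""
    blocks: list[str] = []
    last_speaker = None
    for speaker, text in CUE.findall(vtt_text):
        text = " ".join(text.split())  # normalise internal whitespace
        if speaker == last_speaker:
            blocks[-1] += " " + text   # continue the previous turn
        else:
            blocks.append(f"{speaker}: {text}")
            last_speaker = speaker
    return "\n\n".join(blocks)
```

The real script linked above may differ in details (timestamps, redaction markers); this only illustrates the cue-merging step.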
# Analysis
## First approach (QualCoder)
First round: used FOSS QDA software^[https://github.com/ccbogel/qualcoder] to mark up transcripts
{.absolute top=170 height=450}
:::{.notes}
I found QualCoder pretty good, but too constraining for the type of analysis I wanted to do
- Good for highlights and brief annotations, but not for extended writing and observations
- Difficult to link annotations/observations - forming groups did not feel right
- Struggled to determine the right hierarchical structure
- Difficult to collaborate with Sarah & Naomi
- One thing I found hard: grouping codes/annotations. Not sure what the right hierarchical structure is, tedious to change.
- Made me think about another area where I ditched hierarchical relationships for links.
- To be fair, I probably didn't try hard enough here - I would never say I've found a better method for QDA!
- However, I found quite a high level of friction, typing into pop-up text boxes etc. I guess I'm just soft & used to nice interfaces.
:::
## Second approach (Obsidian)
Realisation: I am doing an **exploratory** study and I need a tool to help me **explore** the data!
- Switched to linked note-taking app Obsidian^[https://obsidian.md/]
- Just a collection of markdown files - suitable for collaboration via git/GitHub
- As a bonus, get backups and version control!
- Easier to collaborate with Sarah & Naomi
:::{.notes}
- Realised that I should not necessarily be following the standard QDA playbook, since I'm not really trying to do conventional QDA
- I don't need to count occurrences etc
- I really just need to be able to explore the data, make observations, and draw everything together into a coherent analysis
- And ideally do so in collaboration with Sarah & Naomi
:::
## Second approach (Obsidian)

## Case file

:::{.notes}
- Step 1 is to create a case file for each student interviewee
- Record my immediate impressions
- Record impressions upon second listening, as I am pulling out quotes
:::
## Quote file

:::{.notes}
- Pull out quotes that 'stand alone'
- Different to traditional QDA where you focus on annotating small snippets: I really wanted to keep everything in context
:::
## Observation file

:::{.notes}
- This is the most similar to a 'code' or 'annotation' used in QDA
- But phrased as an observation to clearly mark it as "something I observed" rather than just a topic
- Link to quotes, papers, etc
:::
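For illustration, an observation file is just a markdown note whose wikilinks point back at quote and literature files (all filenames here are hypothetical, not the actual vault contents):

```markdown
# Observation: students prefer GenAI for lookup over generation

Several interviewees describe using Chat-GPT as "contextualised
documentation" rather than asking it to write code wholesale.

Supporting quotes:
- [[quote-student-B-contextualised-documentation]]
- [[quote-student-E-syntax-recall]]

Related reading: [[Boguslawski2024]]
```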
## Theme file

:::{.fragment .absolute left=50 bottom=50}
### Work in progress!
:::
:::{.notes}
- My idea had been to use the graph view to 'determine' the themes based on clusters of observations
- However, this turned out to be difficult in practice
- Makes you think too hard about linking loosely related observations
- Can lead to reverse engineering of links "I need more links here because I know it's important"
- Better to view links as just a tool for navigation and generating surprise as you explore the dataset!
:::
## Background / Literature file

# More Results
## Superficially similar students have very different experiences {.scrollable}
Student B and Student E are superficially similar:
- Both learned to code very young
- Found university courses easy
- Work on coding projects for fun in their free time
Student B uses free Chat-GPT as "contextualised documentation"
> Documentation is usually pretty general online, but when I ask it a question, I'm able to get almost sort of contextualised documentation.
but rarely bothers with code generation
> when it works it's great, but I think most of the time I'm just kind of frustrated and so I move on very quickly from it. I don't continually ask for more help.
when I pointed out that free Chat-GPT was not the cutting edge:
> I think I would struggle to get it to generate more than a few things at once just because, then it would, kind of, I would, I'd have to really like, look at it and then understand it, and I might as well write it myself
Student E started using Chat-GPT early and quickly moved onto CoPilot in VSCode
> it's just helpful, isn't it? Like I don't remember all syntax at all points. It just speeds up the workflow.
Student E challenged my expectation that "creative coders" would be less interested in GenAI
> I like doing things from scratch if I can.
> I just find it inherently quite interesting. I like problem solving and it's... the only place I find myself feeling like actually creative anymore.
## Ownership {.scrollable}
One of my initial research questions: do students view GenAI as a tool or collaborator?
Answer seems to be tool!
> I mean, I just see it more as a tool, right? You're still the controller of this tool, it just makes you more efficient and quick.
This is bolstered by the way students were using GenAI - for relatively simple 'lookup' tasks, or perhaps debugging or simple autocompletion.
Interesting to see if this changes as students start using 'vibecoding' tools like Cursor, or [Canvas from OpenAI](https://openai.com/index/introducing-canvas/), which seems explicitly designed as a kind of conversational partner.
### Student C
Me: _Do you think it'd be reasonable to say that ChatGPT is a co-author of your code?_
> Yeah, I would not hesitate to say that.
But they mostly describe it as a tool.
### Student F
Student F (never used GenAI) talks of following an online tutorial
> Yeah so I followed that, but I was making sure to actually like do all the coding myself as well, like along with it. And then I was like, I was like branching off, like doing things a bit my way as well.
Suggests that they do not need to start from a blank slate to feel like a project is theirs.
## Students struggling with time pressures {.scrollable}
Strong message - don't use GenAI as an excuse to give us more assignments with shorter turnaround. We already feel like we're not getting enough time to fully engage with material.
### Student I
Remember Student I, one of the strongest advocates of GenAI. _"How would you sum up your view towards generative AI?"_
> Timesaver. I think that time saving is the most important thing. Because I can do everything that Chat-GPT would tell me to do, but it can- for me it can take up to 10 times as long. And so with lots of other work going on that isn't coding based, like quantum mechanics hand-ins or master's projects, or that senior honours projects going on, to be able to have something that's just: "I've just saved you two hours."
What do we make of this from an educational point of view?
Really raises the question: how much adoption is driven by the desire to carve out more time for other things (e.g. studying for other modules, or just leisure time), rather than by students finding it beneficial for their learning?
Programming courses do not have to follow software companies - we shouldn't cram more into courses to make them more "productive"!
### Student N
Student N's interview lasted nearly 70 minutes. They were very critical about the university, but ultimately in a constructive way.
> in a university physics degree, there is not time for that ... you don't get this chance to practise and try things and experiment and you know, learn in the same way because you have, because ... you're so beholden to the deadlines.
### Student D
Student D is an "early adopter" who used various models, but still liked to tweak and tinker with the output.
> So in the coursework you don't have time to ... pinpoint these mistakes, never understand to prevent a future mistake causing by this. It just stacks up and you'll never know.
### Student Q
Student Q is motivated to improve their programming, but prioritises their mathematics courses:
> that's always my worry with the with coding courses... You kind of don't know what kind of coursework load you're signing up for before you do it.
## Student C {.scrollable}
5th year, little programming experience before starting their degree.
Their account of a 'fun' project was a short (two-evening), self-contained side project that demonstrated a statistical concept.
They enjoy using GenAI when programming:
> it's, helping me in the parts that I don't enjoy, which is going into the minutiae and the excruciating detail of there's this library with these methods
> I'm still having to do the parts that I enjoy, which is, you know, trying to break something apart, try to analyse something, try to figure out what's really happening, translating everything into actual words and code.
However, when asked about an "important project", they talked about a very frustrating experience, an internship where they had to use a large numerical model written by their supervisor.
> having to understand someone else's code that they'd written, and then there was all these things going on. I think that kind of kind of pushed me up...
> So I had to just go through it and understand what it was happening ... and then figure out how to make my own edits so that it would work in the way that I was supposed to be doing.
Me: _were you having fun?_
> Oh fuck no. No, it was real frustrating. It was very tiring.
Me: _why did you not use GenAI to help?_
> It was such a big repository that it just felt, I couldn't, you know, how do you even upload the whole thing?
> And then, you know, another thing is that it belongs to someone else. I wouldn't have felt comfortable doing that.
Despite being highly positive about GenAI for programming, they have _not_ fully embraced it
> I am cautious about using AI outside of that, like it's really only programming that, where I'm comfortable using AI.
For them, programming is not a creative or artistic endeavour, but a tool to get at interesting problems, and GenAI is a tool extension.
> I wouldn't use it for anything that involves like creativity or art.
:::{.notes}
- This was one of the earliest and most challenging interviews for me, because they were the first person I talked to who was incredibly positive about GenAI
- I have come round to finding this a source of optimism
:::
## Student I {.scrollable}
Student I found learning to code at university extremely challenging:
> it was like learning Greek for the first time
They felt intimidated by peers who were already experienced and could complete workshop assignments easily
> Any kind of interaction with peers was incredibly stressful
They decided to _"tactically fail"_ the course (Comp Sim) so that they could spend the whole summer holidays focusing on learning to code.
> I basically gave up on that course by the end of Week 2
They felt instructions were insufficient, and assessed workshops didn't give them enough time.
Unsurprisingly, they are one of the strongest advocates for GenAI as a **time-saver**.
> I can do everything that Chat-GPT would tell me to do, but for me it can take up to 10 times as long.
Despite a difficult start, they find programming rewarding (but do not enjoy the process by itself)
> I think it's just I get a great sense of, like, achievement and accomplishment when I've spent, let's say, three or four hours trying to do something and then, you know, you get so many errors ... because you know, I can't necessarily write code that just works straight out the gate.
> You know, you spend a lot of time doing it and then it _works_.
> And you're like, wow, that's so cool.
They still find it satisfying to debug code with the help of GenAI
> It's nice enough that I can do that \[solve the problem\], but in my mind I'm now at a stage where it's like ehhh (shrugs) I'd rather just be able to go "Mr GPT, what's going wrong?"
> And when I can get it fixed ... it's still satisfying. It's still nice that I've been able to fix the problem, but it just stops me spending hours on it.
However...
> I am like presently aware though that it probably would have made me a worse coder in the long run though, because it was a very good thing for me to have had to go out and actually find stuff first. I think it would have made me too lazy.
They insisted that they made the fastest improvements when working on a group project
> whenever I did overcome that \[social anxiety\] and actually asked peers for help, I did get the most benefit from that.