Empirical Research in Political Sciences

A.Y. 2025/2026
Max ECTS: 3
Overall hours: 20
SSD: SPS/04
Language: English
Learning objectives: Undefined
Expected learning outcomes: Undefined
Single course

This course cannot be attended as a single course. Please check our list of single courses to find the ones available for enrolment.

Course syllabus and organization

Single session

Lesson period: Second trimester
The course will utilize a dedicated website hosted on the Ariel platform, where students will find all essential materials, announcements, and resources. We will also use a Microsoft Teams channel for messaging, updates, and recordings.
In case of any emergency or unforeseen circumstance requiring a shift from in-person to remote instruction, course sessions will continue through the Ariel platform and Teams, ensuring that all students can remain engaged and up-to-date.
Course syllabus
Session 01. Introduction
This opening session introduces students to the course and establishes the foundational questions that will guide our exploration of empirical research methods. After brief introductions, we will engage in a critical debate about the scientific status of political science, exploring key tensions surrounding the nature of valid knowledge. The session concludes with an overview of the course structure, learning objectives, and available resources.
At the end of this session, students will be able to identify foundational epistemological questions in political science, differentiate between paradigmatic approaches to empirical inquiry, and reflect critically on the nature of valid knowledge in the discipline. They will also gain an understanding of the structure and goals of the course, setting the basis for self-directed learning.
Backing materials
Mahoney, J., & Goertz, G. (2006). A tale of two cultures: contrasting quantitative and qualitative research. Political Analysis, 14(3), 227-249. https://doi.org/10.1093/pan/mpj017
Walker, T. C. (2010). The perils of paradigm mentalities: revisiting Kuhn, Lakatos, and Popper. Perspectives on Politics, 8(2), 433-451. https://doi.org/10.1017/S1537592710001180
Van Bouwel, J. (2025). Dimensions of the methodological individualism/holism debate. In Y. Shan (Ed.), History and Philosophy of the Social Sciences. Springer. https://philsci-archive.pitt.edu/23605

Module A. Description
This module introduces the foundational elements of descriptive research in political science. Students will learn how to develop clear conceptual frameworks, create valid measurements, and establish reliable indicators for political phenomena. The emphasis is on building analytical tools for systematic description and comparison in political research.

Session 02. Frameworks
This session examines the essential role of descriptive research in political science. Students will be introduced to the reasons why description is a critical component of research. We will analyze Sartori's warnings about concept misformation and stretching and learn Goertz's framework for understanding concept structure.
At the end of this session, students will understand the central role of description in empirical research and learn to assess the integrity of conceptual frameworks. They will also become familiar with criteria to construct and evaluate social science concepts, developing skills essential for autonomous and cumulative learning.

Core readings
Sartori, G. (1970). Concept misformation in comparative politics. American Political Science Review, 64(4), 1033-1053. https://doi.org/10.2307/1958356
Goertz, G. (2020). Concept structure: aggregation and substitutability. In Social Science Concepts and Measurement (Ch. 6). Princeton University Press.
Further readings
Gerring, J. (2012). Mere description. British Journal of Political Science, 42(4), 721-746. https://doi.org/10.1017/S0007123412000130
Adcock, R., & Collier, D. (2001). Measurement validity: a shared standard for qualitative and quantitative research. American Political Science Review, 95(3), 529-546. https://doi.org/10.1017/S0003055401003100
Damonte, A., & Bazzan, G. (2024). Rules as data. Regulation & Governance, 18(3), 657-673. https://doi.org/10.1111/rego.12582

Session 03. Illustrations
Building on the conceptual foundations from Session 02, this session focuses on translating concepts into measurable indicators. Through guided discussions of the assigned readings, student groups will consider the possible shapes of concepts and the consequences of different operationalizations.
Discussion readings
Munck, G. L., & Verkuilen, J. (2002). Conceptualizing and measuring democracy: evaluating alternative indices. Comparative Political Studies, 35(1), 5-34. https://doi.org/10.1177/001041400203500101
Little, A. T., & Meng, A. (2024). Measuring democratic backsliding. PS: Political Science & Politics, 57(2), 149-161. https://doi.org/10.1017/S104909652300063X
Knutsen, C. H., Marquardt, K. L., Seim, B., Coppedge, M., Edgell, A. B., Medzihorsky, J., Lindberg, S. I. (2024). Conceptual and measurement issues in assessing democratic backsliding. PS: Political Science & Politics, 57(2), 162-177. https://doi.org/10.1017/S104909652300077X
Claassen, C., Ackermann, K., Bertsou, E., Borba, L., Carlin, R. E., Cavari, A., Dahlum, S., Gherghina, S., Hawkins, D., Lelkes, Y., Magalhães, P. C., Mattes, R., Meijers, M. J., Neundorf, A., Oross, D., Öztürk, A., Sarsfield, R., Self, D., Stanley, B., Zechmeister, E. J. (2024). Conceptualizing and measuring support for democracy: a new approach. Comparative Political Studies, 58(6), 1171-1198. https://doi.org/10.1177/00104140241259458

Portfolio Deliverable I: Brief
After the session, students choose a political concept of interest to them and submit an original brief (max. 750 words, verified references excluded) that:
1. clearly defines the chosen concept based on the literature,
2. deconstructs it into relevant dimensions or attributes,
3. reasons about the relationships linking these attributes,
4. suggests observable features of these attributes,
5. considers the consequences of these choices for entity classification.

Module B. Causation and Explanation
This module explores causal reasoning and explanatory strategies in political science. Students will be introduced to various approaches to causal inference, from philosophical foundations to practical applications, across theory-driven and design-driven strategies.

Session 04. Frameworks
This session establishes the philosophical and methodological foundations of causal inference in political science. Students will be introduced to various conceptions of causation, ranging from Aristotelian principles to mechanisms and Directed Acyclic Graphs, as well as different epistemic strategies for establishing explanatory and causal claims, including inference to the best explanation and counterfactual designs.
Through this session, students will gain an understanding of various conceptions of causality and distinguish between different philosophical foundations of causal inference. They will learn criteria for evaluating explanatory strategies and begin to appreciate the epistemological trade-offs of various causal claims.

Core readings
Holland, P. W. (1986). Statistics and causal inference. Journal of the American Statistical Association, 81(396), 945-960. https://doi.org/10.1080/01621459.1986.10478354
Fearon, J. D. (1991). Counterfactuals and hypothesis testing in political science. World Politics, 43(2), 169-195. https://doi.org/10.2307/2010470
Mackie, J. L. (1965). Causes and conditions. American Philosophical Quarterly, 2(4), 245-264. https://www.jstor.org/stable/20009173
Further readings
Cunningham, S. (2021). Causal Inference: The Mixtape. Yale University Press, Ch.3. https://mixtape.scunning.com/03-directed_acyclical_graphs
Cinelli, C., Forney, A., & Pearl, J. (2024). A crash course in good and bad controls. Sociological Methods & Research, 53(3), 1071-1104. https://doi.org/10.1177/00491241221099552
Ducheyne, S. (2008). J.S. Mill's canons of induction: from true causes to provisional ones. History and Philosophy of Logic, 29(4), 361-376. https://doi.org/10.1080/01445340802164377
Glennan, S., Illari, P., & Weber, E. (2022). Six theses on mechanisms and mechanistic science. Journal for General Philosophy of Science, 53, 143-161. https://doi.org/10.1007/s10838-021-09587-x
Lieberson, S. (1991). Small N's and big conclusions: an examination of the reasoning in comparative studies based on a small number of cases. Social Forces, 70(2), 307-320. https://doi.org/10.1093/sf/70.2.307
McGrew, T. (2003). Confirmation, heuristics, and explanatory reasoning. The British Journal for the Philosophy of Science, 54(4), 553-567. http://www.jstor.org/stable/3541678
Moravcsik, J. M. E. (1974). Aristotle on adequate explanations. Synthese, 28(1), 3-17. http://www.jstor.org/stable/20114949

Session 05. Theory-driven strategies
This session examines theory-driven approaches to causal analysis. Through question-guided group presentations and discussions, students will become familiar with the basic methodological choices involved in constructing and applying explanatory typologies, conducting Bayesian process tracing to test causal mechanisms, selecting conditions for Qualitative Comparative Analysis, and testing causal structures.
Through this session, students will refine their capacity to connect the philosophical foundations of causal inference with actual methodologies, appreciate their differences, contrast their strengths and limits, consider their appropriateness for specific 'why questions,' and clearly communicate complex concepts to broader audiences.

Discussion readings
Elman, C. (2005). Explanatory typologies in qualitative studies of international politics. International Organization, 59(2), 293-326. http://www.jstor.org/stable/3877906
Fairfield, T., & Charman, A. E. (2017). Explicit Bayesian analysis for process tracing: guidelines, opportunities, and caveats. Political Analysis, 25(3), 363-380. https://doi.org/10.1017/pan.2017.14
Amenta, E., Poulsen, J. D. (1994). Where to begin: a survey of five approaches to selecting independent variables for Qualitative Comparative Analysis. Sociological Methods & Research, 23(1), 22-53. https://doi.org/10.1177/0049124194023001002

Portfolio Deliverable II: Method Journal Entry
After the session, each student is asked to submit a max. 500-word journal entry - plus verified references - responding to the following:
"This session's discussion readings present distinct theory-driven strategies for causal analysis, each embedding theoretical reasoning into causal analysis in different ways. Please select one strategy and, in a maximum of 500 words, consider:
a) how does it use theory to reveal causation?
b) what assumptions about the relationship between theory and empirics does it require?
c) what kinds of research questions can it address?"

Session 06. Design-based strategies
This session explores actual examples of design-based approaches to causal inference in political science. Through guided discussion of the assigned readings, student groups will consider how natural experiments leverage exogenous variation, how conjoint survey experiments isolate causal effects, and how paired comparisons can be used to test causal claims.
Through this session, students will refine their capacity to connect the philosophical foundations of designs for causal identification with actual methodologies, appreciate their differences, contrast their strengths and limits, and consider their appropriateness for specific 'why questions'.
Discussion readings
Tarrow, S. (2010). The strategy of paired comparison: Toward a theory of practice. Comparative Political Studies, 43(2), 230-259. https://doi.org/10.1177/0010414009350044
Dunning, T. (2008). Improving causal inference: Strengths and limitations of natural experiments. Political Research Quarterly, 61(2), 282-293. https://doi.org/10.1177/1065912907306470
Bansak, K., Hainmueller, J., Hopkins, D. J., & Yamamoto, T. (2021). Conjoint survey experiments. In J. N. Druckman & D. P. Green (Eds.), Advances in Experimental Political Science (pp. 19-41). Cambridge University Press. https://doi.org/10.1017/9781108777919.004

Portfolio Deliverable III: Method Journal Entry
After the session, each student is asked to submit a maximum 500-word journal entry - plus verified references - responding to the following:
"This session's discussion readings present distinct strategies for identifying causal effects, each resting on a different logic for making causal claims. Please select one strategy, concentrate on the logical foundations that make it suitable for causal identification, and consider:
a) what makes it capable of revealing causation rather than mere correlation?
b) on which assumptions does it build?
c) do you find it easy or hard to commit to these assumptions? Why?"

Session 07. Mixing and balancing theory and design
This session explores strategies that integrate theoretical and design-based approaches to achieve more robust causal inference. Through group discussion of the assigned readings, students will become familiar with how directed acyclic graphs and nested analysis can be used to establish causation.
Through this session, students will develop their capacity to integrate philosophical foundations of causal inference and actual methodologies, consider strengths and limits of nesting, consider its appropriateness to specific 'why questions,' and clearly communicate complex concepts to broader audiences.
Discussion readings
Kauffman, C. M. (2012). More than the sum of the parts: nested analysis in action. Qualitative & Multi-Method Research, 10(2). https://doi.org/10.5281/ZENODO.912350
Knox, D., Lowe, W., & Mummolo, J. (2020). Administrative records mask racially biased policing. American Political Science Review, 114(3), 619-637. https://doi.org/10.1017/S0003055420000039
Møller, J., & Skaaning, S. E. (2017). Explanatory typologies as a nested strategy of inquiry: Combining cross-case and within-case analyses. Sociological Methods & Research, 46(4), 1018-1048. https://doi.org/10.1177/0049124115613778

Portfolio Deliverable IV: Brief
After the session, students build on the phenomenon conceptualized in their Deliverable I and submit a maximum 750-word brief - plus verified references - that:
a) asks a causal/explanatory question about the phenomenon
b) selects a design-based strategy and a theory-based strategy, and connects them into a nested research design
c) considers whether this nesting can allow better responses to the driving question than single strategies alone, and why

Module C. Prediction
This module introduces the logic and applications of prediction in political science, distinguishing between predictive aims and explanatory goals. Students will learn how predictive strategies are employed to forecast political phenomena, test theoretical models, and guide decision-making. The module explores both traditional forecasting techniques and modern computational tools, with a focus on their epistemological underpinnings, methodological trade-offs, and relevance to political inquiry.

Session 08. Framework
This session establishes the conceptual foundations of predictive research in political science. Students will examine the distinction between explanatory and predictive goals, the theoretical and practical value of prediction, and the implications of different modeling strategies.
By the end of this session, students will be able to distinguish between explanatory and predictive approaches, understand the conceptual and methodological foundations of prediction, and grasp the different assumptions underlying predictive models in political science.

Core readings
Ray, J. L., & Russett, B. (1996). The future as arbiter of theoretical controversies: predictions, explanations, and the end of the Cold War. British Journal of Political Science, 26(4), 441-470. http://www.jstor.org/stable/194092
Shmueli, G. (2010). To explain or to predict? Statistical Science, 25(3), 289-310. https://doi.org/10.1214/10-STS330
Further readings
Levi, M. (2004). An analytic narrative approach to puzzles and problems. In I. Shapiro, R. M. Smith, & T. E. Masoud (Eds.), Problems and Methods in the Study of Politics (pp. 201-226). Cambridge University Press. https://doi.org/10.1017/CBO9780511492174.010
Granato, J., Lo, M., & Wong, M. C. S. (2021). Contemporary methodological practices. In Empirical Implications of Theoretical Models in Political Science (pp. 12-25). Cambridge University Press. https://doi.org/10.1017/9781139026819.005
Grimmer, J., Roberts, M. E., & Stewart, B. M. (2021). Machine learning for social science: An agnostic approach. Annual Review of Political Science, 24(1), 395-419. https://doi.org/10.1146/annurev-polisci-053119-015921
de Slegte, J., Van Droogenbroeck, F., Spruyt, B., Verboven, S., & Ginis, V. (2024). The use of Machine Learning methods in political science: an in-depth literature review. Political Studies Review, 0(0). https://doi.org/10.1177/14789299241265084

Session 09. Strategies
Building on the conceptual distinctions from Session 08, this session explores practical strategies for conducting predictive research in political science. Through guided discussion of the assigned readings, students will analyze examples of different forecasting approaches. Emphasis will be placed on understanding the logic behind each strategy, the nature of the data it requires, and the assumptions it embeds about political processes.
Through this session, students will refine their capacity to identify key predictive strategies and assess their suitability for different types of political questions. They will understand the trade-offs between transparency, accuracy, and generalizability and begin to reason critically about the political and normative implications of predictive research.

Discussion readings
Nalepa, M. (2010). Captured commitments: an analytic narrative of transitions with transitional justice. World Politics, 62(2), 341-380. http://www.jstor.org/stable/40646203
Bechtel, M. M., & Leuffen, D. (2010). Forecasting European Union politics: Real-time forecasts in political time series analysis. European Union Politics, 11(2), 309-327. https://doi.org/10.1177/1465116509360846
Lewis-Beck, M. S., & Tien, C. (2012). Election forecasting for turbulent times. PS: Political Science & Politics, 45(4), 625-629. https://doi.org/10.1017/S1049096512000893
Muchlinski, D., et al. (2021). We need to go deeper: measuring electoral violence using convolutional neural networks and social media. Political Science Research and Methods, 9(1), 122-139. https://doi.org/10.1017/psrm.2020.32
Love, G. J., Carlin, R. E., & Singer, M. M. (2025). LASSOing the governor's mansion: a machine-learning approach to forecasting gubernatorial elections. PS: Political Science & Politics, 58(2), 226-233. https://doi.org/10.1017/S1049096524000866
Törnberg, P. (2023). ChatGPT-4 outperforms experts and crowd workers in annotating political Twitter messages with zero-shot learning. arXiv preprint arXiv:2304.06588. https://doi.org/10.48550/arXiv.2304.06588

Portfolio Deliverable V: Method Journal Entry
After the session, each student is asked to submit a maximum 500-word journal entry - plus verified references - responding to the following:
"This session's discussion readings present distinct approaches to prediction in political science. Please select one strategy, concentrate on the logical foundations that make it suitable for prediction, and consider:
a) What makes this approach capable of predicting political outcomes rather than merely describing patterns?
b) What assumptions does it make about the nature of political processes and their predictability?
c) Do you find it easy or hard to commit to these assumptions? Why?"


Module D. Wrapping up
Session 10. Research Proposal Workshop
This capstone session offers practical guidance for crafting a credible empirical research proposal. Students will learn how to integrate descriptive, explanatory, and/or predictive strategies into a coherent research design. The session will provide step-by-step advice on formulating research questions, justifying methodological choices, and presenting a persuasive case for empirical inquiry. Students will apply these principles in developing their final research proposal sketch.
Through this session, students will formulate a research question and outline a coherent empirical design integrating descriptive, explanatory, and/or predictive strategies. They will make informed methodological choices and justify them clearly, demonstrating readiness for self-directed empirical work and ongoing methodological development.

Deliverable VI: Research Proposal Memo
Building on previous briefs and journal entries, each student will upload a max. 1,000-word memo - plus verified references - that:
1. develops their own empirical research question, paying special attention to conceptualization issues,
2. outlines a research strategy covered in the course,
3. justifies the consistency of the chosen research strategy with the research question,
4. anticipates, where possible, the limitations of the findings.

NB: changes to the program may be made to better align with students' progress and engagement.
Prerequisites for admission
This course is designed to be accessible to students with a wide range of backgrounds. No prior coursework in statistics, formal theory, or advanced research methods is required.
However, students should be prepared to engage with methodological readings and to think critically about research design. A basic familiarity with social science research (from prior coursework or experience) will be helpful, but is not assumed.
Support and guidance will be provided throughout the course and during office hours to help all students build the necessary skills.
Teaching methods
The ERiPS course employs a multi-modal, research-driven pedagogy that integrates philosophical inquiry, hands-on application, and peer-based learning. Teaching methods are strategically deployed to foster deep understanding, critical engagement, and transferable methodological competencies. The following modes are recurrently used across sessions, ensuring both variety and coherence in the student learning experience.
1. Instructor-facilitated walkthroughs of concepts and strategies
The course includes structured segments in which the instructor introduces theoretical concepts, explains methodological strategies, and illustrates their application through examples drawn from the literature. These guided walkthroughs ensure progressive understanding and support the development of autonomous and cumulative research skills.
2. Structured group discussions
A substantial component of learning takes place through guided peer discussions based on prompt questions and assigned readings. Small groups are encouraged to compare alternative approaches and develop critical thinking. The instructor facilitates discussion to ensure alignment with the learning objectives.
3. Iterative writing and synthesis
Iterative writing and syntheses: Students progressively develop their methodological reasoning through a sequence of brief assignments, method journal entries, and a final research memo. Each task is designed to foster reflective engagement with core concepts, promote clarity in analytical writing, and cultivate methodological awareness for independent research design.
4. Reflective integration and guided design
The capstone session invites students to integrate the knowledge they have acquired across the course into an original research design sketch. Reflection is scaffolded through prompts that guide students in aligning conceptualization, strategy, and inference. This mode promotes intellectual autonomy, integrative thinking, and readiness for more focused method courses.

The ERiPS structure develops students' skills gradually, from foundational concepts through analytical options to a research proposal sketch. The readings include canonical texts and cutting-edge research published in top peer-reviewed journals, ensuring exposure to both foundational debates and emerging methodological innovations. Lastly, the oral examination offers an opportunity for individual feedback and grade calibration, emphasizing reflection.
Teaching Resources
In addition to the core readings, discussion readings, further readings, and slide decks, anyone wishing to improve their familiarity with political science theories and methods, or to approach the course topics from a different perspective, can refer to:
Lowndes, V., D. Marsh, and G. Stoker (eds.). (2018). Theory and Methods in Political Science (Fourth edition). Palgrave.
McNabb, D. E. (2020). Research Methods for Political Science: Quantitative, Qualitative and Mixed-Method Approaches. Routledge.

NB: course materials will be made available through the Ariel website. Please contact the instructor to get access.
Assessment methods and Criteria
Student learning is assessed throughout the entire course via a Portfolio composed of six assignments aligned with the core sessions. Each Portfolio task contributes to the development of skills in empirical research and directly targets specific learning objectives:

Deliverable I. Brief — 7 points
Students choose a political concept, define it based on the literature, deconstruct it into dimensions, and reflect on how measurement choices affect classification. This exercise builds conceptual clarity and precision in empirical description.
Dublin descriptors: 1, 2, 3, 4

Deliverable II. Method Journal Entry — 3 points
Students select one theory-driven causal strategy and critically reflect on its use of theory, assumptions about evidence, and vulnerability to falsification. This fosters epistemological awareness and analytical precision.
Dublin descriptors: 1, 2, 3, 4

Deliverable III. Method Journal Entry — 3 points
Focusing on one design-based causal strategy, students analyze its logic, assumptions, and inferential strength. This encourages critical judgment and methodological reasoning.
Dublin descriptors: 1, 2, 3, 4

Deliverable IV. Brief — 7 points
Students revisit their earlier concept and propose a causal question linked to it. They combine one theory-driven and one design-based strategy into a coherent nested design, justifying the pairing. This exercise integrates learning and advances design competence.
Dublin descriptors: 1, 2, 3, 4, 5

Deliverable V. Method Journal Entry — 3 points
Students select a predictive strategy and assess its logic, assumptions, and applicability to political processes. This fosters critical reflection on methodological trade-offs in predictive research.
Dublin descriptors: 1, 2, 3, 4

Deliverable VI. Research Proposal Memo — 10 points
In this capstone task, students formulate an original empirical research question, justify their conceptualization and chosen strategy, and sketch a coherent empirical design. The memo consolidates descriptive, explanatory, and predictive logics, demonstrating independent judgment, methodological integration, and readiness for delving into specific methods for thesis-level research.
Dublin descriptors: 1, 2, 3, 4, 5

Each deliverable is grounded in lectures, slide decks, readings, and class discussions. Participation in classroom discussions is not mandatory; however, it is strongly encouraged as it helps students complete their assignments far more effectively.

The portfolio-based assessment ensures that students are not evaluated on fragmented skills, but on their ability to synthesize concepts, critically familiarize themselves with methods, and develop original research proposals that reflect real-world research practice. Transparent rubrics will be circulated before discussion classes to enable students to assess their conceptual clarity and methodological awareness.

Lastly, students can refine their Portfolio grade during a final colloquium, worth up to ±10% of the final grade. The colloquium builds on the Portfolio and allows each student to clarify, expand, or revise elements of their submitted work through an individual discussion. This summative assessment supports reflection on methodological decisions, encourages synthesis across assignments, and may result in an adjustment of the final grade by:
+10% if the discussion reveals outstanding reflective insight or significant methodological refinement;
0% if the oral confirms the coherence and quality of the portfolio;
−10% if significant gaps emerge in the student's understanding or ability to articulate core course principles.

The final Italian grade bands will correspond to the following qualitative evaluations:
- lower than 18: Not passing. Major elements are missing or incoherent. Responses are undeveloped, and there is little to no evidence of engagement with course materials. Submission does not meet the expectations of MA-level work.
- between 18 and 20: Perfunctory. Covers the task only at a surface level. Work appears rushed or incomplete, with significant weaknesses in explanation, structure, or use of literature, and minimal critical thinking.
- between 21 and 23: Fair. Meets the basic requirements of the assignment. Key elements are addressed, but with confusion, lack of clarity, or limited engagement with the readings. Shows effort, but with notable gaps in argumentation or structure.
- between 24 and 26: Good. A sound work that covers all required parts. Demonstrates an understanding of the main concepts, although the analysis may appear shallow. Some choices may be weakly justified or underdeveloped, but the overall structure holds.
- between 27 and 29: Very Good. Demonstrates a solid understanding of the task and core ideas. Most components are well done, but depth, justification, or structure may be uneven in places. Shows strong potential but may lack the consistency or precision of excellent work.
- 30: Excellent. Addresses all components with clarity and critical insight. Arguments are coherent and supported with appropriate literature. The structure is logical, and the reasoning is sound. It may have small areas that could be further refined.
- 30 cum laude: Outstanding. Shows deep and original thinking throughout. Conceptual choices are sophisticated, well-justified, and demonstrate full command of the readings. All parts are handled with care and creativity.
SPS/04 - POLITICAL SCIENCE - University credits: 3
Laboratories: 20 hours
Professor: Damonte Alessia
Professor(s)
Reception:
Friday 13:30-14:30 (students) - 14:30-16:30 (thesis students and PhD candidates)
internal building, 2nd floor, room 12 | VirtualOffice channel in Teams