Aleksandar Petrov
Doctoral student, University of Oxford
Verified email at robots.ox.ac.uk - Homepage
Title · Cited by · Year
Language Model Tokenizers Introduce Unfairness Between Languages
A Petrov, E La Malfa, PHS Torr, A Bibi
Conference on Neural Information Processing Systems (NeurIPS), 2023
24 · 2023
Integrated Benchmarking and Design for Reproducible and Accessible Evaluation of Robotic Agents
J Tani, AF Daniele, G Bernasconi, A Camus, A Petrov, A Courchesne, ...
IEEE International Conference on Intelligent Robots and Systems (IROS), 2020
12 · 2020
Learning Camera Miscalibration Detection
A Cramariuc, A Petrov, R Suri, M Mittal, R Siegwart, C Cadena
IEEE International Conference on Robotics and Automation (ICRA), 4997-5003, 2020
9 · 2020
When Do Prompting and Prefix-Tuning Work? A Theory of Capabilities and Limitations
A Petrov, PHS Torr, A Bibi
International Conference on Learning Representations (ICLR 2024), 2023
6 · 2023
HiddenGems: Efficient safety boundary detection with active learning
A Petrov, C Fang, KM Pham, YH Eng, JGM Fu, SD Pendleton
IEEE International Conference on Intelligent Robots and Systems (IROS), 2022
2 · 2022
Robustness of Unsupervised Representation Learning without Labels
A Petrov, M Kwiatkowska
arXiv preprint arXiv:2210.04076, 2022
2 · 2022
Language Models as a Service: Overview of a New Paradigm and its Challenges
E La Malfa, A Petrov, S Frieder, C Weinhuber, R Burnell, R Nazar, ...
arXiv preprint arXiv:2309.16573, 2023
1* · 2023
Certifying Ensembles: A General Certification Theory with S-Lipschitzness
A Petrov, F Eiras, A Sanyal, PHS Torr, A Bibi
International Conference on Machine Learning (ICML), 2023
1 · 2023
Search Algorithms and Safety Verification for Compliant Domain Volumes
SD Pendleton, AP Petrov, CTY Fang, MK Pham, JGM Fu, YH Eng
US Patent App. 17/941,712, 2023
1 · 2023
Optimizing Multi-rendezvous Spacecraft Trajectories: Matrices and Sequence Selection
A Petrov, R Noomen
arXiv preprint arXiv:2011.06617, 2020
1 · 2020
Prompting a Pretrained Transformer Can Be a Universal Approximator
A Petrov, PHS Torr, A Bibi
arXiv preprint arXiv:2402.14753, 2024
2024
Compositional Computational Systems
A Petrov
ETH Zurich, 2020
2020
Articles 1–12