As Manjur said, nowadays the best option is deep learning. The most common choice is BERT (Pre-training of Deep Bidirectional Transformers for Language Understanding) and its variants. These are built on the Transformer encoder, which is better suited to extracting information than decoder-only models like the GPTs.
These models are fine-tuned for tasks like NER, and pre-trained versions already exist for entities such as persons and locations. You can find ready-made implementations in spaCy, for example. But if you want better accuracy, have experience with PyTorch or TensorFlow and with preprocessing corpora, and have labelled data (datasets containing only these entities also exist), then you can fine-tune a model yourself.
Libraries: https://medium.com/@b.terryjack/nlp-pretrained-named-entity-recognition-7caa5cd28d7b#:~:text=
Different approaches and SOTA: https://primer.ai/blog/a-new-state-of-the-art-for-named-entity-recognition/
A full implementation of NER with BERT using a CSV (it's on Kaggle, so you can also download the dataset): https://www.kaggle.com/abhishek/entity-extraction-model-using-bert-pytorch
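If you do fine-tune yourself, the labelled data these pipelines expect is usually in the BIO scheme (B- for the first token of an entity, I- for continuation tokens, O for everything else). A minimal, library-free sketch of turning character-span annotations into BIO tags (the helper name `to_bio` and the example sentence are mine, not from any of the links above):

```python
def to_bio(tokens, spans):
    """Convert span annotations to token-level BIO tags.

    tokens: list of (word, start_char) pairs
    spans:  list of (start_char, end_char, label) entity annotations
    """
    tags = []
    for word, start in tokens:
        tag = "O"
        for s, e, label in spans:
            if start == s:
                tag = "B-" + label          # token opens an entity span
            elif s < start < e:
                tag = "I-" + label          # token continues an entity span
        tags.append(tag)
    return tags

# "Angela Merkel visited Paris" with two annotated entities:
tokens = [("Angela", 0), ("Merkel", 7), ("visited", 14), ("Paris", 22)]
spans = [(0, 13, "PER"), (22, 27, "LOC")]
print(to_bio(tokens, spans))  # ['B-PER', 'I-PER', 'O', 'B-LOC']
```

The Kaggle notebook above does essentially this, plus the extra step of aligning the tags with BERT's subword tokenization.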