Selenium returns NoSuchElementException Error

I'm a newbie to Python. Recently I got interested in web crawling.

Today I got stuck on a NoSuchElementException.

This is the webpage that I want to scrape.

When I click the username (which I blurred out), a box like this appears.

Though I used the XPath that I copied from the Chrome developer tools, it returns a NoSuchElementException:

Message: no such element: Unable to locate element: {"method":"xpath","selector":"//*[@id="main-area"]/div[4]/table/tbody/tr[1]/td[2]/div/table/tbody/tr/td/a"}
  (Session info: chrome=87.0.4280.88)

HTML is like this

<a href="#" class="m-tcol-c" onclick="ui(event, 'royaltina',3,'이주연마인','25868806','me', 'false', 'true', 'schoolch', 'false', '5'); return false;">이주연마인</a>

My code is like this:


I checked that this XPath exists on the page, but when I pass it to the .find_element_by_xpath() method it returns the error.

I would really like to share the webpage, but it requires logging in, so I cannot share it.

Could you guess what might cause this problem?

I checked that timing is not the problem, and that an iframe is not the problem.

Thank you in advance. Have a great day!



To locate the element with the text 이주연마인 you need to induce WebDriverWait for element_to_be_clickable(), and you can use either of the following locator strategies:

  • Using LINK_TEXT:

    element = WebDriverWait(driver, 20).until(EC.element_to_be_clickable((By.LINK_TEXT, "이주연마인")))
  • Using CSS_SELECTOR:

    element = WebDriverWait(driver, 20).until(EC.element_to_be_clickable((By.CSS_SELECTOR, "a.m-tcol-c[onclick*='royaltina']")))
  • Using XPATH:

    element = WebDriverWait(driver, 20).until(EC.element_to_be_clickable((By.XPATH, "//a[@class='m-tcol-c' and contains(@onclick, 'royaltina')][text()='이주연마인']")))
  • Note: You have to add the following imports:

    from selenium.webdriver.support.ui import WebDriverWait
    from selenium.webdriver.common.by import By
    from selenium.webdriver.support import expected_conditions as EC
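As a sanity check that the locator conditions themselves match the posted HTML (independent of timing or iframe issues, which need a live browser to test), you can parse the `<a>` snippet from the question with Python's standard library. The `AnchorCollector` class below is a hypothetical helper for this answer, not part of the Selenium API; it just verifies the three attribute/text conditions that the locators above encode.

```python
# Offline check of the locator logic against the HTML posted in the question.
# AnchorCollector is a throwaway helper (an assumption of this sketch), built
# on the stdlib html.parser -- no Selenium or live page required.
from html.parser import HTMLParser

SNIPPET = (
    '<a href="#" class="m-tcol-c" onclick="ui(event, \'royaltina\',3,'
    "'이주연마인','25868806','me', 'false', 'true', 'schoolch', 'false', '5');"
    ' return false;">이주연마인</a>'
)

class AnchorCollector(HTMLParser):
    """Collects (attributes, text) pairs for every <a> element parsed."""

    def __init__(self):
        super().__init__()
        self.anchors = []      # list of (attrs_dict, inner_text)
        self._current = None   # in-progress [attrs_dict, text] while inside <a>

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._current = [dict(attrs), ""]

    def handle_data(self, data):
        if self._current is not None:
            self._current[1] += data

    def handle_endtag(self, tag):
        if tag == "a" and self._current is not None:
            self.anchors.append(tuple(self._current))
            self._current = None

parser = AnchorCollector()
parser.feed(SNIPPET)
attrs, text = parser.anchors[0]

# The exact conditions the three locators rely on:
assert attrs["class"] == "m-tcol-c"       # matches a.m-tcol-c
assert "royaltina" in attrs["onclick"]    # matches [onclick*='royaltina'] / contains(@onclick, ...)
assert text == "이주연마인"                # matches LINK_TEXT / text()='이주연마인'
print("all locator conditions match the snippet")
```

If all three assertions pass but Selenium still raises NoSuchElementException at runtime, the cause is almost certainly the page state (element not yet rendered, or inside a frame) rather than the selector, which is exactly what the WebDriverWait calls above address.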


You can find a couple of relevant discussions on NoSuchElementException in: