I am trying to scrape a few things from the website (URL in the code below). I can scrape the brand name and the SDR, but anything that sits below the SDR I cannot seem to scrape. I am only testing this on the first result for now; once I figure it out I will make it dynamic. Hopefully anyone who just has Selenium in their project and the Chrome driver set up can copy/paste this code.
The below code gives the following error:
Exception in thread "main" org.openqa.selenium.TimeoutException: Expected condition failed: waiting for visibility of element located by By.xpath: /html/body/app-root/ecl-app/div[2]/app-search-page/app-search-container/div/div/section/div/app-elec-display-search-result/app-search-result/eui-block-content/div/app-search-result-item[1]/article/div[2]/div/app-elec-display-search-result-parameters/app-search-parameter-item[4]/div[2]/div/div[2]/div/div[1]/span (tried for 10 second(s) with 500 milliseconds interval)
Code:
public void scrape() throws InterruptedException {
System.out.println("Starting Scrape!");
String url = "https://eprel.ec.europa.eu/screen/product/electronicdisplays";
WebDriver driver = new ChromeDriver();
WebDriverWait wait = new WebDriverWait(driver, Duration.ofSeconds(10));
driver.get(url);
driver.manage().window().maximize();
WebElement until = wait.until(ExpectedConditions.presenceOfElementLocated(By.className("eui-block-content__wrapper")));
//The results have been loaded now
//Click on accept cookie page:
new WebDriverWait(driver, Duration.ofSeconds(3)).until(ExpectedConditions.elementToBeClickable(By.linkText("Accept all cookies"))).click();
String moreButton = "/html/body/app-root/ecl-app/div[2]/app-search-page/app-search-container/div/div/section/div/app-elec-display-search-result/app-search-result/eui-block-content/div/app-search-result-item[1]/article/div[3]/div/a";
String xPathBrandName = "/html/body/app-root/ecl-app/div[2]/app-search-page/app-search-container/div/div/section/div/app-elec-display-search-result/app-search-result/eui-block-content/div/app-search-result-item[1]/article/div[1]/div/div/div[1]/span[1]";
String xPathSDR = "/html/body/app-root/ecl-app/div[2]/app-search-page/app-search-container/div/div/section/div/app-elec-display-search-result/app-search-result/eui-block-content/div/app-search-result-item[1]/article/div[2]/div/app-elec-display-search-result-parameters/app-search-parameter-item[3]/div[1]/div/div[2]/div/div[1]/span";
String energyRatingString = "/html/body/app-root/ecl-app/div[2]/app-search-page/app-search-container/div/div/section/div/app-elec-display-search-result/app-search-result/eui-block-content/div/app-search-result-item[1]/article/div[2]/div/app-elec-display-search-result-parameters/app-search-parameter-item[4]/div[2]/div/div[2]/div/div[1]/span";
//Click on the more button so that more of the result is loaded and visible
driver.findElement(By.xpath(moreButton)).click();
WebElement SDR = driver.findElement(By.xpath(xPathSDR));
//Using this logic to scroll to each of the results so they're visible on the web page
JavascriptExecutor js = (JavascriptExecutor) driver;
js.executeScript("arguments[0].scrollIntoView();", SDR);
WebElement brandName = driver.findElement(By.xpath(xPathBrandName));
WebElement energyRating = wait.until(ExpectedConditions.visibilityOfElementLocated(By.xpath(energyRatingString)));
System.out.println("Brand name: " + brandName.getText());
System.out.println("SDR name: " + SDR.getText());
System.out.println("energyRating: " + energyRating.getText());
}
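For anyone copy/pasting: these are the imports the method needs, plus the minimal wrapper I am assuming around it (the class name and main method are placeholders of mine; since Selenium 4.6 the Chrome driver binary is resolved automatically by Selenium Manager, otherwise chromedriver must be on the PATH):
import java.time.Duration;

import org.openqa.selenium.By;
import org.openqa.selenium.JavascriptExecutor;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

// Placeholder class name: the scrape() method posted above goes inside this class.
public class EprelScraper {

    public static void main(String[] args) throws InterruptedException {
        new EprelScraper().scrape();
    }

    // public void scrape() throws InterruptedException { ... }  // method posted above
}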
Coming back to the error: replacing
WebElement energyRating = wait.until(ExpectedConditions.visibilityOfElementLocated(By.xpath(energyRatingString)));
with
WebElement energyRating = driver.findElement(By.xpath(energyRatingString));
gives the following output:
Starting Scrape!
Brand name: Samsung
SDR name: 63
energyRating:
So I'm puzzled: the element is evidently being found (otherwise findElement would throw a NoSuchElementException), yet energyRating.getText() comes back empty and the visibility wait in the first version times out. Why can I not scrape anything below the SDR?
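In case it helps, the diagnostic I am thinking of adding (a sketch only, not part of the run above) is to compare the rendered text with the raw DOM text, since as far as I know getText() only returns text from elements that are actually displayed:
// Diagnostic sketch (not part of the failing run): getText() only returns rendered/visible text,
// so comparing it with the raw DOM textContent should show whether the node is hidden or truly empty.
WebElement energyRating = driver.findElement(By.xpath(energyRatingString));
String rendered = energyRating.getText();                    // comes back empty in the output above
String domText = energyRating.getAttribute("textContent");   // raw text, returned even for hidden elements
boolean displayed = energyRating.isDisplayed();              // false would explain the empty getText()
System.out.println("displayed=" + displayed + ", rendered='" + rendered + "', textContent='" + domText + "'");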