Growing up, I went to church and always aligned myself with the fundamental beliefs of Christian theology. We didn't use the term "evangelical." It wasn't a useful term, in that it added nothing to the belief we had in Christ. Nonetheless, the root word means "good news," and to Christians, of course, the Gospel of Jesus Christ is indeed good news.
I study my Bible and pray daily. For a long time, I wasn't aware of the strain of Christianity that refers to itself as evangelical. Nowadays the word is synonymous with conservative white Republican activist Christian. The definition of the word has changed.
Powerful, wealthy right-wing Christian leaders use this religious ideology to manipulate the emotions and religious inclinations of mostly white conservative Christians. In some instances, wealthy Religious Right leaders gaslight congregants, recasting the poor behavior of President Trump as "God's will" and insisting it was "God's will" that Trump be elected. Far be it from God's will.
Concerns over the term have increased with big-name evangelical Christian support of Donald Trump. Are right-wing words useful? What do they make the world, and our fellow citizens, think about Christians? In my opinion, the politics of the Religious Right has become a thorn in the side of the average believer: those who identify as Christians but want breathing room between themselves and right-wing political activism carried out in the name of Christianity.
For the most part, white conservative Christians focus on a few cultural issues that I do not support in the political sense: opposing gay marriage; working to end abortion; defending gun rights; supporting "school choice," which is nothing more than self-segregation; and embracing far-right religious extremism. They force the idea that America is a "Christian Nation" and promote Christian Nationalism, which is a way for the Religious Right to insist we must all support a specific interpretation of a particular strain of Christianity. That's not religious freedom. That's fascism.
Evangelicalism has become synonymous with white Christians, which leaves people of color out of the fold. Christianity isn't a white or black religion; therefore, words that suggest otherwise are indeed counterproductive. It's not about political correctness; it's about seeing each other as equals.
Christianity is about following Jesus; you don't need the term evangelical to affirm that position. Some Christians refer to themselves as evangelical, orthodox, red-letter Christian, or progressive Christian. All share the same core principle: a relationship with Jesus Christ. The approach to following Christ and serving people varies, but at the end of the day, Christianity's core is Jesus Christ, not the Republican Party, as is commonly understood.