"You have no doubt seen and been startled
by headlines such as these…
"Christianity Declines Sharply in US,
Agnostics Growing: Pew" – Newsmax
"Christians In U.S. On Decline As Number Of
'Nones' Grows, Survey Finds" – NPR
"Big Drop in Share of Americans Calling
Themselves Christian" - New York Times
"America Is Getting Less Christian and Less
Religious, Study Shows" – HuffPo
With such powerful and respected news
sources proclaiming this sour news, who
could possibly doubt the truth of it?" (Is Biblical Christianity on the Decline?/
By Glenn Stanton-Focus On The Family)
In a time when the church needs to hold to Biblical truth and speak out strongly concerning it, it is instead in what I call a "dreadful decline." We have done exactly what Satan and the world want us to do: crawl into a corner, say nothing, and do nothing. We have the same responsibility the early church had: to strongly influence our culture. We are to be what Jesus taught in Matthew 5:13-16: "You are the salt of the earth... You are the light of the world..."
As the church, we are not to let the world affect us and dictate to us what is right, what is truth, and how we are to cater to its wants and desires. We are to affect the world with the message of Jesus Christ and bring people to Him for salvation.
We are not to make people comfortable at church services, but to be a hospital for those who have been beaten up by the world. When the Christian church begins to meet needs instead of catering to wants, we will stop seeing a dreadful decline, and we will see an upward display of spiritual growth and maturity! (GP)
Photo: PublicDomainPictures/18042 images, CC0 Creative Commons, free for commercial use, no attribution required, www.pixabay.com