ZPG purchase of Expert Agent to be reviewed by competition watchdog

Shares in the group drop after announcement first thing this morning.

Nigel Lewis, 31st March 2017

ZPG's purchase of software company Expert Agent earlier this month has been called in by the Competition and Markets Authority (CMA) and is now under review, prompting shares in ZPG to drop this morning.

The CMA has served an initial enforcement order on ZPG under section 72(2) of the Enterprise Act 2002.

The announcement was made by ZPG this morning, saying that "having been informed of the review and the order, ZPG is now engaging in a consultation with the CMA and will make a further announcement in due course".

The move has been widely anticipated within the industry after ZPG acquired Expert Agent (which trades under the name Websky Ltd) from private equity firm Metropolis, which in turn had bought the company from founder Mike Griffiths in 2004.

The acquisition of Expert Agent gave ZPG a substantial slice of the UK agent software market, as it had already bought Property Software Group (PSG) in April 2016. At the time of its acquisition by ZPG, PSG claimed to be used by over 40,000 estate and letting agent branches across the UK, while Expert Agent is used by over 2,500 branches.

The CMA has been on the warpath within the property industry recently, fining four estate agents in Burnham-on-Sea a total of £370,000 over an illegal price-fixing cartel in the seaside town, and it is now offering £100,000 rewards to agents who report similar anti-competitive behaviour in their areas.
1. The total number of indexed pages
Checking the search engines every day to see how many of your pages they have indexed is ultimately a matter of effort. In fact, even if your site is doing well, the number of pages that search engines index will often fluctuate considerably. With other conditions unchanged, the only reason to worry about a decline in indexed pages is a corresponding drop in site traffic, unless you can find better evidence that the missing pages either caused the traffic decrease or had nothing to do with it. In other words, if you lose 30% of your pages from the Google index, and the removed pages were bringing no traffic to your site, you can conclude that the decrease in indexed pages is not the reason for the decrease in website traffic.
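To make that check concrete, here is a minimal sketch of the attribution calculation, assuming you have a list of de-indexed URLs and per-page visit counts exported from your analytics tool (the function name and data shapes are hypothetical):

```python
# Estimate how much traffic the de-indexed pages were actually bringing in.
# Assumes two hypothetical inputs: a set of URLs that dropped out of the
# index, and a dict of {url: visits} exported from your analytics tool.

def lost_traffic_share(deindexed_urls, visits_by_url):
    """Return the fraction of total visits attributable to de-indexed pages."""
    total = sum(visits_by_url.values())
    if total == 0:
        return 0.0
    lost = sum(visits_by_url.get(url, 0) for url in deindexed_urls)
    return lost / total

deindexed = {"/old-page-1", "/old-page-2"}
visits = {"/": 5000, "/popular-post": 1200, "/old-page-1": 3, "/old-page-2": 0}

share = lost_traffic_share(deindexed, visits)
print(f"De-indexed pages accounted for {share:.1%} of traffic")
# If this share is near zero, the drop in indexed pages is unlikely to
# explain a drop in overall traffic.
```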
If your website is reachable under two hosts, with and without the www prefix (for example, www.fcjjr.com and fcjjr.com), you should redirect one of them to the other URL using a 301 permanent redirect. Doing so may reduce search traffic and the number of indexed pages in the short run (especially when both versions rank well), but in the long run your site will be better off.
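One way to confirm the redirect is wired up correctly is to request the non-canonical host and check for a 301 status. A minimal sketch using only the standard library, with the hostname taken from the example above:

```python
# Verify that the non-www host answers with a 301 pointing at the www host.
# http.client does not follow redirects, so we see the raw response.
import http.client

def check_canonical_redirect(host, path="/"):
    conn = http.client.HTTPConnection(host, timeout=10)
    conn.request("GET", path)
    resp = conn.getresponse()
    status = resp.status
    location = resp.getheader("Location")
    conn.close()
    return status, location

status, location = check_canonical_redirect("fcjjr.com")
if status == 301:
    print(f"OK: 301 permanent redirect to {location}")
else:
    print(f"Warning: got {status}; a 301 (not a 302) is what consolidates signals")
```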
If you know which pages are no longer indexed, you can try changing the site's navigation structure to give search engine crawlers more opportunities to find these buried pages. In addition, check whether the non-indexed pages duplicate other content on your site or closely resemble content on other sites. This can be difficult for some directory sites, because their content is primarily the merchants' marketing copy. In general, unique, user-generated content (preferably with multiple internal and external links) should not be ignored by search engines.
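A rough way to flag near-duplicate pages is shingle-based Jaccard similarity; here is a minimal sketch, where the shingle size and threshold are illustrative assumptions rather than any standard:

```python
# Flag near-duplicate pages by comparing word "shingles" (overlapping
# word n-grams) with Jaccard similarity.

def shingles(text, n=5):
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(max(len(words) - n + 1, 1))}

def jaccard(a, b):
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

page_a = "Buy cheap widgets here, the best widgets in town with free shipping"
page_b = "Buy cheap widgets here, the best widgets in town with fast delivery"

similarity = jaccard(shingles(page_a), shingles(page_b))
print(f"Similarity: {similarity:.2f}")
if similarity > 0.5:  # illustrative threshold
    print("Likely duplicates: search engines may index only one of them")
```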
Search engines that support Sitemaps.org give users another way to have crawlers capture their content: insert your XML sitemap feed URL into robots.txt, and the search engines will automatically find the corresponding address. With Ask.com also joining the list of supporting search engines, XML sitemaps will become crucial for getting crawlers to capture your pages.
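A minimal sketch of generating such a sitemap and the robots.txt line that advertises it (the URLs are placeholders):

```python
# Generate a minimal XML sitemap and the robots.txt directive that points
# to it. The Sitemap: line is what Sitemaps.org-aware crawlers look for.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

pages = ["https://www.example.com/", "https://www.example.com/about"]
with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(build_sitemap(pages))

# The corresponding line to add to robots.txt:
print("Sitemap: https://www.example.com/sitemap.xml")
```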
2. The number of internal links
Each search engine counts links differently, so hoping that the number of links every search engine reports will stay stable and fluctuate only slightly is unrealistic.
If you are sure that the drop in traffic comes from the site's internal links …
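Rather than relying on each engine's fluctuating counts, you can measure internal links on your own pages directly. A minimal sketch using only the standard library (the URL is a placeholder):

```python
# Count internal vs. external links on a single page using only the
# standard library.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkCounter(HTMLParser):
    def __init__(self, base_url):
        super().__init__()
        self.base = urlparse(base_url)
        self.base_url = base_url
        self.internal = 0
        self.external = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        # Resolve relative links against the page, then compare hosts.
        target = urlparse(urljoin(self.base_url, href))
        if target.netloc == self.base.netloc:
            self.internal += 1
        else:
            self.external += 1

url = "https://www.example.com/"
html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
counter = LinkCounter(url)
counter.feed(html)
print(f"internal links: {counter.internal}, external: {counter.external}")
```

Running this periodically over your key pages gives you a stable baseline of your own, independent of how any particular engine happens to count links that week.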