allu62

Members
  • Posts: 128
  • Days Won: 2

Everything posted by allu62

  1. I don't think so. The first condition (with !=on) is for http, the second (with =on) for https. Anyway, the redirection works (and I hope it will work with .well-known, too). My question was in fact about the flags (I do not really understand the explanations given in the Apache manual, and using [L] where I now use [R] resulted in a "can't show page because of too many redirects" error).
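     Here is how I currently read the flags, shown on one of the rules from my file (just my understanding, please correct me if this is wrong):

        # [R=301] makes Apache send an external 301 redirect to the browser;
        # [L] stops processing further rewrite rules in this pass.
        # A fully qualified https://... target triggers a redirect anyway,
        # but without R=301 it defaults to a temporary 302.
        RewriteRule ^index\.php$ https://www.streetinfo.lu/ [R=301,L]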
  2. I don't understand: the text gets changed after I push the Post button. Maybe because it comes from Windows? Sorry! I'll attach the file instead... htaccess.txt
  3. Something went wrong when copying/pasting the .htaccess content. Here is what is really in it:

     RewriteOptions inherit
     RewriteEngine on

     RewriteCond %{HTTPS} !=on
     RewriteCond %{HTTP_HOST} ^streetinfo\.lu$ [OR]
     RewriteCond %{HTTP_HOST} ^www\.streetinfo\.lu$
     RewriteCond %{REQUEST_URI} !^/.well-known/ [NC]
     RewriteRule .* https://www.streetinfo.lu%{REQUEST_URI} [R=301,L]
     RewriteRule ^index\.php$ https://www.streetinfo.lu/ [R]
     RewriteRule ^computing/lazarus/index\.php$ https://www.streetinfo.lu/computing/lazarus/ [R]

     RewriteCond %{HTTPS} =on
     RewriteCond %{HTTP_HOST} ^streetinfo\.lu$
     RewriteRule .* https://www.streetinfo.lu%{REQUEST_URI} [R=301,L]
     RewriteRule ^index\.php$ https://www.streetinfo.lu/ [R]
     RewriteRule ^computing/lazarus/index\.php$ https://www.streetinfo.lu/computing/lazarus/ [R]

     Options -Indexes
  4. Sorry that the last two questions weren't really the last ones. The error page is for later; the redirection works as I want, but I'm not sure whether my .htaccess file content is "as it should be", and I would appreciate it if someone who understands more about this stuff than myself had a look at it. The intention: redirect all http://www.streetinfo.lu and http://streetinfo.lu URIs (except the .well-known folder) to https, and redirect https://streetinfo.lu to https://www.streetinfo.lu. The index.php redirects are temporary, just to avoid 404 errors from links that have not yet been updated after global changes on the site. Thanks, and sorry for bothering you again.

     RewriteOptions inherit
     RewriteEngine on

     RewriteCond %{HTTPS} !=on
     RewriteCond %{HTTP_HOST} ^streetinfo\.lu$ [OR]
     RewriteCond %{HTTP_HOST} ^www\.streetinfo\.lu$
     RewriteCond %{REQUEST_URI} !^/.well-known/ [NC]
     RewriteRule .* https://www.streetinfo.lu%{REQUEST_URI} [R=301,L]
     RewriteRule ^index\.php$ https://www.streetinfo.lu/ [R]
     RewriteRule ^computing/lazarus/index\.php$ https://www.streetinfo.lu/computing/lazarus/ [R]

     RewriteCond %{HTTPS} =on
     RewriteCond %{HTTP_HOST} ^streetinfo\.lu$
     RewriteRule .* https://www.streetinfo.lu%{REQUEST_URI} [R=301,L]
     RewriteRule ^index\.php$ https://www.streetinfo.lu/ [R]
     RewriteRule ^computing/lazarus/index\.php$ https://www.streetinfo.lu/computing/lazarus/ [R]

     Options -Indexes
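     In case it helps the review: I also wonder whether the two blocks could be collapsed into a single one, something like the following (an untested sketch, assuming the [OR] groups the second and third conditions together as the Apache docs describe):

        RewriteEngine on
        # Anything that is not already https://www.streetinfo.lu gets a
        # permanent redirect there, keeping the requested path; requests
        # under /.well-known/ are left alone for AutoSSL validation.
        RewriteCond %{REQUEST_URI} !^/\.well-known/ [NC]
        RewriteCond %{HTTPS} !=on [OR]
        RewriteCond %{HTTP_HOST} !^www\.streetinfo\.lu$ [NC]
        RewriteRule .* https://www.streetinfo.lu%{REQUEST_URI} [R=301,L]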
  5. There is no real need to make access possible for old OSes/browsers, except my conviction that the Internet should be a shared resource, accessible and usable by everyone (that's why all my programs will always be free of charge and open source, even though with just 0,50€ per download I could have a comfortable life with several hot meals a week). On the other hand, I came to the same conclusion as you: OSes that do not support TLS 1.2 are not far from 20 years old. And another consideration: the choice between increasing browsing security for the huge majority of users, or not doing so because of the probably very rare users still on these "very old" OSes. So it's OK for me to follow your advice. Just two last questions:
     - is there any possibility to detect such failing access and display a personalized message?
     - is it normal that the .well-known folder is empty?
     Thanks a lot for the quick and helpful answers. Support on HelioHost is really great (AAA)!
  6. Do you never sleep, or how do you manage to answer so quickly? What advice would a pro give me? Most sites on the web automatically switch to https if you type an address in the browser, and they are indexed in Google Search with https URLs. Mine are all indexed with http. Would it be a possibility to do no rewrite at all, but add an https canonical meta tag to each page? That way everyone should be able to access the site, AutoSSL renewal would stay automatic, and all those coming from a Google link would use https, as I would prefer (not sure, non-pro that I am)... Thanks...
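     If it matters: instead of editing every page, I suppose the same effect could also be had from .htaccess with a canonical Link header (just an idea, assuming mod_headers is enabled; the file name and URL below are only an example for the home page):

        # Google also accepts rel=canonical sent as an HTTP Link header,
        # so this should work like a <link rel="canonical"> tag in the page.
        <Files "index.php">
            Header set Link "<https://www.streetinfo.lu/>; rel=\"canonical\""
        </Files>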
  7. Hi. Intending to force the usage of https when accessing my site, I only noticed now that SSL seems to be installed and working. Sorry for these "stupid" questions, but I just want to be really sure that what I intend to do is correct.
     1. Certificates and keys have been generated and validated. On the "Manage installed SSL websites" page in cPanel, there is a button to install the certificates on the server. Do I have to push this button, or is everything already as it should be, nothing to do on my side, and, when expired, does everything renew by itself?
     2. To force http to https, is adding a rewrite rule to .htaccess the best method?
     3. Some older browsers (probably still used by lots of people in some parts of the world) do not support SNI. Does that mean that these people could no longer access the site if I do the rewrite? How could this issue be solved?
     Thanks for answering. And thanks to the HelioHost team for this nice thing called AutoSSL.
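     For question 2, the rule I have in mind is something like this (a minimal sketch, to be placed in the .htaccess of the document root):

        RewriteEngine on
        # Redirect any plain-http request to the same URL over https
        RewriteCond %{HTTPS} !=on
        RewriteRule .* https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]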
  8. Thanks, I really thought it could be something that does not belong there... Concerning awstats, I use it from cPanel for all my statistics. My question was whether it is possible to access awstats, or to use the data it stores in /tmp, in order to view the statistics from a link on my site. Happy Easter to the whole team!
  9. Hi. Please, can someone tell me what /tmp/pma_template_compiles_allu (subdir twig, with lots of subdirs whose names are numbers) is, and how, what for and by whom it was created? Another question concerns awstats. I think that there is no direct possibility to access the statistics from a website link. But has perhaps someone tried to use the files stored in the /tmp/awstats dir to display the statistics (it would not be real time, of course, but better than nothing)? Thanks for any suggestion.
  10. There are dozens of sites (SEO and others) recommending to block what they call "bad robots", or even suggesting to allow only the "good ones"... My awstats from yesterday:
      - SemrushBot: 18,536+356 hits, 84.77 MB
      - AhrefsBot: 4,591+261 hits, 46.92 MB
      - Unknown robot identified by bot\*: 2,908+147 hits, 37.03 MB
      and similar for other days. In comparison, Googlebot: 1,248+853,,70 MB, and this only a few times a month... Totals for this month: 29,808 pages (428.45 MB) of not-viewed traffic vs. 630 pages (199.83 MB) of normal traffic. Should I really let these crawlers carry on? And by doing so, isn't that a senseless "overload" of Tommy?
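      If blocking is the way to go, what I would try is refusing them by user agent in .htaccess, roughly like this (a sketch; the substrings are taken from the awstats names above and may need adjusting):

         RewriteEngine on
         # Answer 403 Forbidden to requests whose User-Agent contains
         # one of the crawler names (case-insensitive).
         RewriteCond %{HTTP_USER_AGENT} (SemrushBot|AhrefsBot) [NC]
         RewriteRule .* - [F,L]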
  11. Hi. Last month I made a post concerning the decline of the visitor numbers on my site to 1..2 per day during May and the first half of June, thinking that there were problems with the server or awstats. I now guess that the statistics were correct and that there was a reason that has nothing to do with HelioHost. Anyway, visit numbers are becoming normal again. But what could have been the reason? Searching the Internet, I found several articles saying that excessive site access by bad crawlers may drastically affect normal site traffic. And effectively, when having a closer look at the Apache log files, there are some robots that download tons of megabytes from my site. My questions:
      - Does HelioHost have any recommendations about which robots to block?
      - Does any experienced user have such recommendations? Perhaps a list of bad robots? Or maybe block all and just allow some known to be OK?
      - What should be blocked in robots.txt and what in .htaccess?
      - Should some of them be blocked using IP Blocking in cPanel?
      I hope to find some help, because I really have no knowledge of or experience with these things. Thanks.
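      From what I have read so far, the commercial crawlers are supposed to honour robots.txt, so something like the following in the site root might already be enough for them (only my assumption; bots that ignore robots.txt would still have to be blocked in .htaccess or by IP):

         User-agent: SemrushBot
         Disallow: /

         User-agent: AhrefsBot
         Disallow: /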
  12. Searching the Internet, I found that:
      - Bad website performance may be related to bad crawlers. Are there crawlers that you should always block? In robots.txt, in .htaccess? (There is actually a lot of access by certain robots.)
      - Having too many pages indexed would be bad for performance. With currently nearly 4000 animals in my database, and thus 4000 dynamically generated Perl pages, should I use "noindex" to keep all these pages out of the index?
      - Has anyone experienced something similar, I mean a site with increasing performance and then suddenly just a handful of visitors per day?
      Thanks for your help...
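      Concerning the "noindex" idea, what I had in mind is sending it as a header from .htaccess rather than editing 4000 pages (a sketch, assuming mod_headers is available and that the generated pages come from scripts ending in .pl; the pattern would have to match the real script names):

         # Ask search engines not to index responses produced by the
         # Perl scripts, without blocking them from crawling the pages.
         <FilesMatch "\.pl$">
             Header set X-Robots-Tag "noindex"
         </FilesMatch>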
  13. No, I haven't deleted the /tmp folder content. awstats, analogstats and webalizer are all three turned on. Perhaps the number of visitors has really fallen this dramatically. What could be the reason (in Google Search all is normal; I publish new programs and articles as before...)?
  14. Hello. I'm trying to understand what is happening on my site. Last month, pages viewed and downloads were less than half of the months before, and this month awstats reports mostly fewer than 5 visitors a day, often 1 or even 0. I wondered if perhaps Tommy was often down or had other problems. Or if HelioHost blocked me because the download volume was too big (> 1 GB in March) or because I did some other thing wrong. After all, 0 or 1 visitor per day, that is not normal. Then I thought that perhaps there is something wrong with the statistics applications. In fact, the Apache log file has the same size as in the months before, and the bandwidth application in cPanel reports 800 MB (versus 16 MB viewed + 140 MB not-viewed traffic in awstats). Anyone who can help? Thanks.
  15. Thanks for having taken all this time for me. Anyway, with or without Google, those who should read my articles never will. People who say what they think and tell how my country and its citizens really are, are anything but well regarded here. Thanks again.
  16. I'm actually spending (losing?) lots of time trying to understand Google Search. Sorry for posting all these questions here, but I have no idea where else to find help. Thanks in advance for any answers and suggestions.
      1. Google seems to see my site as 2 different sites, one with "www" and the other without. Thus, some indexed pages appear as belonging to the "www" site and others not, and as a result there are nearly as many duplicates as valid pages in Google Search Console. Would creating a sitemap with all-"www" entries set all pages canonical with "www" and eliminate the duplicates? Should I use full URLs instead of relative links on my pages? Should I perhaps create a redirect from http://streetinfo.lu/... to http://www.streetinfo.lu/...?
      2. My homeless site appears on page 1 of Google search results, but not with the entry page: with a secondary, rarely changing page. May this be related to the fact that the "www" version of the entry page is considered by Google as a duplicate (the "non-www" version having been chosen as canonical)? Should I use an "alternate" in the secondary page's metadata pointing to the entry page as canonical? Or would the sitemap perhaps resolve the problem?
      3. Similar situation and same questions concerning 2 articles about the same social institution. Google displays the article from last year in search results; the one from this year does not appear there, or more correctly only appears if further keywords are added (in the Google search).
      4. My PDF download files are not indexed and result in a "Something went wrong" when I try to index them in Google Search Console. May this be due to the fact that they don't contain metadata?
      I hope that someone understands more about these things than I do. And has the time to help. Thanks.
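      For the redirect in question 1, what I have in mind is this kind of rule in the top-level .htaccess (only a sketch; it sends the bare domain to the "www" host with a permanent redirect):

         RewriteEngine on
         # 301-redirect http://streetinfo.lu/... to http://www.streetinfo.lu/...
         # so that Google sees a single site instead of two.
         RewriteCond %{HTTP_HOST} ^streetinfo\.lu$ [NC]
         RewriteRule .* http://www.streetinfo.lu%{REQUEST_URI} [R=301,L]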
  17. Hi. Can anyone please tell me how to define aliases like the following (from the Apache httpd.conf on my Windows 10 machine): Alias /animaldb "C:/Programs/Apache24/htdocs/computing/website/animaldb" corresponding to the links: /animaldb/ => /computing/website/animaldb/ There is an Alias item in cPanel, but as I understood it, this is intended for domain aliasing rather than for directories. I actually use Redirect, which works well, but the redirected pages are excluded from Google Search because of being redirects. Thanks for suggestions.
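      One thing I am considering, in case a real Alias is not possible in .htaccess on shared hosting, is an internal rewrite without the [R] flag (a sketch, assuming the files live under computing/website/animaldb below the document root; the browser URL stays /animaldb/..., so Google should not see a redirect):

         RewriteEngine on
         # Serve /animaldb/... from /computing/website/animaldb/... internally
         RewriteRule ^animaldb/(.*)$ computing/website/animaldb/$1 [L]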
  18. Problem solved. GD and the other dependencies are installed, and simply copying the Chart folder with all its modules to my application directory works fine (or should I copy such modules to /home/allu/perl?). Everything is working as before again - just faster! Thanks, guys!
  19. Hi. I used the Perl Chart module (https://metacpan.org/pod/distribution/Chart/Chart.pod) on Tommy before the disk crash. Is there any possibility to install it on the actual system? Thanks.
  20. Now it works - thanks a lot. Curiosity question: On old Tommy my directory listing script took up to 15 secs or even more. On new Tommy the listing is displayed without any delay. Where does this gain of speed come from?
  21. Thanks. The Perl database scripts are now OK. Chart is not yet working, however... For the PHP problem, I actually use a work-around by considering the file extension instead of the MIME content type: this is OK for me, as I only have a few types of files...
  22. Thanks for answering. I recreated the database entirely by hand. Not a big deal, Tommy is really fast... PHP is the inherited version, PHP 7.2 (ea-php72). Chart doesn't work. Nor do my other Perl scripts. I get a MySQL error (probably also the reason that Chart does not work):

      install_driver(mysql) failed: Can't locate DBD/mysql.pm in @INC (@INC contains: "." "." /usr/local/lib64/perl5 /usr/local/share/perl5 /usr/lib64/perl5/vendor_perl /usr/share/perl5/vendor_perl /usr/lib64/perl5 /usr/share/perl5 .) at (eval 5) line 3.
      Perhaps the DBD::mysql perl module hasn't been fully installed, or perhaps the capitalisation of 'mysql' isn't right.
      Available drivers: DBM, ExampleP, File, Gofer, Mem, Proxy, SQLite, Sponge.
  23. Thanks for the move! Is it possible that good old Tommy is really this fast? Some problems/questions:
      1. Is it likely that a MySQL database restore works via cPanel (on Ricky I always failed), or do you suggest that I restore everything manually?
      2. A PHP script says "undefined function mime_content_type()". I read in some blogs that this function is missing in some installations; has this changed compared with the "old Tommy PHP"?
      3. Is there any possibility to install the Perl Chart module (https://metacpan.org/pod/distribution/Chart/Chart.pod)? The admin had installed it server-wide on Tommy before, when I asked how I could use it; locally would be fine, too. If not possible, I could try to just copy the module and all its dependencies (GD, etc.) to my cgi-bin...
      Thanks...
  24. Tommy Is Back!

      Thanks a lot. Can't wait to be back on good old (new) Tommy again!