Clone Magento to another hosting/domain

Magento is one of the fastest growing eCommerce platforms on the internet, and with its acquisition by eBay it promises to stay in the lead and keep growing at an even faster pace than before. It has been the platform of choice for many of my clients, and recently I had to do some development work on one of their websites. The shop was running Magento 1.7 on a dedicated CentOS server.

Move Magento to different domain/subdomain



Reset Magento admin password with SQL

If you forget your Magento admin password and can't remember the email address, or just want a quick fix, you can sort the issue out with a single line of SQL.
[cc lang="sql"]UPDATE admin_user SET password=CONCAT(MD5('qXpassword'), ':qX') WHERE username='admin';[/cc]
All you have to do is replace "password" with the one you want (keeping the "qX" salt prefix) and run the query.
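To see why this works: Magento 1.x stores admin passwords as md5(salt . password) followed by ':' and the salt. Here is a minimal sketch of that check (validate_password is a hypothetical helper for illustration, not Magento's actual code):

```php
<?php
// Hypothetical helper illustrating Magento 1.x's salted-MD5 scheme:
// admin_user.password holds "md5(salt . password):salt".
function validate_password($input, $stored_hash) {
    list($hash, $salt) = explode(':', $stored_hash);
    return md5($salt . $input) === $hash;
}

// The SQL above stores CONCAT(MD5('qXpassword'), ':qX'),
// i.e. salt "qX" and password "password":
$stored = md5('qXpassword') . ':qX';
var_dump(validate_password('password', $stored)); // bool(true)
```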


Point root domain to a subfolder with .htaccess

If you are running your site on shared hosting and you have multiple domains installed as subdirectories, things can get out of hand really fast. In my specific case I have multiple installs of different CMSes and one main domain running from the "root" directory. What I wanted was to be able to run each of my domains in a separate subfolder. Here is the solution:
[cc lang="text"]
RewriteEngine on
# Part 1 (replace example.com with your own domain)
RewriteCond %{HTTP_HOST} ^(www\.)?example\.com$ [NC]
RewriteCond %{REQUEST_URI} !^/topdomain/
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ /topdomain/$1
# Part 2
RewriteCond %{HTTP_HOST} ^(www\.)?example\.com$ [NC]
RewriteRule ^(/)?$ topdomain/index.php [L]
[/cc]
In the first part I match the domain, check that the request isn't already inside our subdirectory, make sure the URL doesn't match any existing file or directory, and finally serve the whole request from our subdirectory, passing along any other request parameters that come with the original URL.

The second part serves any request to our top-level domain that has no path at all from the index file in our designated subdirectory. This is needed because the first rule would be skipped for the bare root URL, since our root is an existing directory.

So just replace the domain and the subdirectories and you are ready to go.


PHP STDIN: stream files from the CLI

In a console environment PHP can be very useful, especially for running long tasks or CRON jobs.

You can execute your script like:
[cc lang="text"]php scriptName.php[/cc]
You can also pass some parameters like:
[cc lang="text"]php scriptName.php param1 param2[/cc]
Those can be retrieved through $argv, a PHP variable that is only available in console mode:
[cc lang="php"]
$param1 = $argv[1];
$param2 = $argv[2];
[/cc]
There is also another small trick that can parse the arguments into the $_GET superglobal:
[cc lang="php"]parse_str(implode('&', array_slice($argv, 1)), $_GET);[/cc]
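For example (using a hypothetical script name and parameters), running `php scriptName.php name=John age=30` would populate $_GET like this:

```php
<?php
// Simulate: php scriptName.php name=John age=30
$argv = array('scriptName.php', 'name=John', 'age=30');

// Everything after the script name is joined with '&'
// and parsed as if it were a query string.
parse_str(implode('&', array_slice($argv, 1)), $_GET);

var_dump($_GET['name']); // string(4) "John"
var_dump($_GET['age']);  // string(2) "30"
```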
That all looks very good, but imagine you want to pass a file, or even better a stream, from the console. It won't be received as a normal parameter, but there is a way to get the data from the incoming stream, and it is very similar to reading files in PHP:
[cc lang="php"]
$in = fopen('php://stdin', 'r');
$text = '';
while (!feof($in)) {
    $text = $text . fgets($in, 4096);
}
fclose($in);
[/cc]
This block of code reads all incoming data from the STDIN stream and appends it to the $text variable. So how can you use it from the CLI? Here are two examples:
[cc lang="text"]
php scriptName.php < fileData.txt
cat someFile | php scriptName.php
[/cc]
Imagine you have a CSV file you want to pass to your script: that's how to do it, simple and clean.
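Putting the pieces together, here is a minimal sketch (importCsv.php is a made-up name) that reads a CSV file from STDIN and parses each line into fields:

```php
<?php
// Usage: php importCsv.php < users.csv
// or:    cat users.csv | php importCsv.php
$in = fopen('php://stdin', 'r');
$rows = array();
while (($line = fgets($in, 4096)) !== false) {
    // str_getcsv() splits one CSV line into an array of fields
    $rows[] = str_getcsv(trim($line));
}
fclose($in);

foreach ($rows as $row) {
    echo implode(' | ', $row), PHP_EOL;
}
```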

Move PrestaShop to a new server or domain

If you need to migrate a PrestaShop website from a local to a remote installation, or just change the domain name, you need to follow these steps:

0. For safety, make a backup of all files and the database

1. Upload all files to new hosting via FTP

2. Import the database using the phpMyAdmin utility or through terminal

3. In the database, open the ps_configuration table and edit the following records: PS_SHOP_DOMAIN and PS_SHOP_DOMAIN_SSL, replacing the current domain with the new one

4. Update ps_shop_url to match the new path and domain
[cc lang="sql"]UPDATE `ps_shop_url` SET `domain` = 'example.com', `domain_ssl` = 'example.com', `physical_uri` = '/';[/cc]
Replace example.com with your new domain. If your PrestaShop lives under a subdirectory of the main website, set physical_uri to "/shop/" or similar.

5. Lastly, edit config/settings.inc.php and change the values for the database name (_DB_NAME_), user (_DB_USER_) and password (_DB_PASSWD_)
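For reference, the relevant lines in config/settings.inc.php look like this (all values here are placeholders, not real credentials):

```php
<?php
// config/settings.inc.php (PrestaShop 1.x) - placeholder values
define('_DB_SERVER_', 'localhost');
define('_DB_NAME_', 'prestashop_db');
define('_DB_USER_', 'db_user');
define('_DB_PASSWD_', 'db_password');
```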

6. Copy across your COOKIE hashes as well: _COOKIE_KEY_, _COOKIE_IV_, _RIJNDAEL_KEY_ and _RIJNDAEL_IV_, otherwise users, including the PrestaShop admin, won't be able to log in

In the same config file you have the following fields:
[cc lang="php"]
define('_COOKIE_KEY_', 'wjawOHv6ZUKlo0Ewy8Qr3sjYYBAdgcfekAbU3bgAcJgHz3mETtyggtjo');
define('_COOKIE_IV_', 'a2LWT1T4');
define('_RIJNDAEL_KEY_', 'Le4Lkak7GbkTxeK5HLzchFIHx9xQQQ3WF');
define('_RIJNDAEL_IV_', '1p+Cc0gcZOKtKR9ozrBRyg==');
[/cc]
Make sure all these fields have been copied across to the new installation and that the values are exactly the same. Cookies and passwords are encoded with those values, and if they don't match, users won't be able to log in.

That's it, you only need to check the .htaccess file permissions (777) and all should work correctly!


HTTP Cache-Control and PHP page caching

In the general case, website caching happens on the server side, where PHP output or database objects are cached in static files or in the server's RAM. That works pretty well, but we can improve on it by adding one extra layer; more specifically, by defining HTTP headers for cache control so as to utilize the browser cache on the client side.

Procedural PHP implementation

[cc lang="php"]
function set_headers($file, $timestamp) {
    $gmt_mtime = gmdate('r', $timestamp);
    header('ETag: "'.md5($timestamp.$file).'"');
    header('Last-Modified: '.$gmt_mtime);
    header('Cache-Control: public');
    if ((isset($_SERVER['HTTP_IF_MODIFIED_SINCE']) && $_SERVER['HTTP_IF_MODIFIED_SINCE'] == $gmt_mtime)
        || (isset($_SERVER['HTTP_IF_NONE_MATCH']) && str_replace('"', '', stripslashes($_SERVER['HTTP_IF_NONE_MATCH'])) == md5($timestamp.$file))) {
        header('HTTP/1.1 304 Not Modified');
        exit;
    }
}
[/cc]
The above method receives the path to an existing file on the server and, as a second parameter, the timestamp of the file's last change. An example call of the method:
[cc lang="php"]set_headers(__FILE__, filemtime(__FILE__));[/cc]
Now let's look in more detail at how it works. The most important bit, which allows us to validate the file change, is the ETag, which stores a hash of the file path and the timestamp of its last change. This ensures that if the file is updated at a later point, we set a different ETag to validate against.

The second HTTP header, Last-Modified, is obvious, but it's important because it tells the browser to set the proper request headers according to the file state.

The next directive, Cache-Control: public, explicitly tells the browser, and any proxies handling the request, that the content may be cached, and forces the browser to check the ETag and Last-Modified headers to determine the file state. You can read more about the Cache-Control directive and its options here.

Once we've set our headers, the next step is to check the headers from the previous request and decide what to output based on the file state. We will use the $_SERVER global variable, which holds all of the data we need. The first parameter is HTTP_IF_MODIFIED_SINCE, which should match the last modified date of the file; the second is HTTP_IF_NONE_MATCH, which should match the file-and-timestamp hash. If both checks pass, we can set the HTTP status to 304 Not Modified and let the browser serve the file from its local cache instead of transferring the data again, making the whole experience faster and smoother for the end user.

In case the two conditions don't match, we simply serve the request, output the original PHP content, and allow it to be cached for subsequent requests.

Zend Framework cache implementation

In ZF you can take advantage of the preDispatch() method of controllers and set the headers according to your own logic and conditions:
[cc lang="php" escaped="true"]
class TestController extends Zend_Controller_Action
{
    public function preDispatch()
    {
        // Your logic here
        $response = $this->getResponse();
        $response->setHeader('ETag', md5(filemtime(__FILE__).__FILE__), $replace = true);
        $response->setHeader('Last-Modified', gmdate('r', filemtime(__FILE__)), $replace = true);
        $response->setHeader('Cache-Control', 'public', $replace = true);
        if ($fileNotExpired === true) {
            // Set 304 header
        } else {
            // Serve the file
        }
    }
}
[/cc]
That is a rough and dirty example, but it shows how to utilize this functionality in your scripts; it can be adapted further to your specific needs and logic.


Malformed XML breaks Magento

I had an odd white screen on one of my websites. No exceptions, no messages, so it was time to check the error log.

There I found the following error:
[cc lang="text"]PHP Fatal error: Call to a member function extend() on a non-object in httpdocs/lib/Varien/Simplexml/Config.php on line 600[/cc]
That looked very scary like there was some mess with the core library of Magento.

Actually it turned out not to be that bad: I just had a malformed XML file under my "app/etc/modules/" directory.

I removed the file (you might need a bit of trial and error here) and all worked like a charm.

Also make sure that the directory contains only XML files with the ".xml" extension.
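To skip the trial and error, here is a quick sketch that checks every file under app/etc/modules/ for well-formed XML (run it from the Magento root):

```php
<?php
// Report any file in app/etc/modules/ that is not well-formed XML.
libxml_use_internal_errors(true);
$files = glob('app/etc/modules/*.xml');
foreach ($files as $file) {
    if (simplexml_load_file($file) === false) {
        echo "Malformed XML: $file", PHP_EOL;
    }
    libxml_clear_errors();
}
```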


Got problems with GIT on CentOS?

I was trying to install GIT on a CentOS 6.2 development server and ran into a very odd problem:
[cc lang="text" escaped="true"]# yum install git
Loaded plugins: fastestmirror, priorities
Loading mirror speeds from cached hostfile
* epel:
Setting up Install Process
Resolving Dependencies
--> Running transaction check
---> Package git.x86_64 0: will be installed
--> Processing Dependency: perl-Git = for package: git-
--> Processing Dependency: perl(Git) for package: git-
--> Processing Dependency: for package: git-
--> Processing Dependency: for package: git-
--> Processing Dependency: for package: git-
--> Processing Dependency: for package: git-
--> Running transaction check
---> Package compat-expat1.x86_64 0:1.95.8-8.el6 will be installed
---> Package git.x86_64 0: will be installed
--> Processing Dependency: for package: git-
---> Package openssl098e.x86_64 0:0.9.8e-17.el6.centos.2 will be installed
---> Package perl-Git.x86_64 0: will be installed
--> Processing Dependency: perl(:MODULE_COMPAT_5.8.8) for package: perl-Git-
--> Finished Dependency Resolution
Error: Package: git- (epel)
Error: Package: perl-Git- (epel)
Requires: perl(:MODULE_COMPAT_5.8.8)[/cc]
So there are these two errors, missing libcurl and the Perl compat module, which I checked: both were installed and running under /usr/sbin.

The other message that gets your attention is:
[cc lang="text" escaped="true"]There are unfinished transactions remaining. You might consider running yum-complete-transaction first to finish them.[/cc]
I tried to run that but hit another rock:
[cc lang="text" escaped="true"]# bash: yum-complete-transaction: command not found[/cc]
That was logical and completely my mistake; the tool just needed installing:
[cc lang="text" escaped="true"]yum install yum-utils[/cc]
OK, now everything was in place. I cleaned up the unfinished transactions, but got the same error as in the beginning. The solution I finally found was to run the install with the EPEL repository disabled:
[cc lang="text" escaped="true"]# yum install git --disablerepo=epel[/cc]
Everything installed and ran like a charm. Hopefully this helps you as well.

NOTE: Git and Perl dependencies can also break yum update.

To fix that issue, add the following line to your /etc/yum.repos.d/epel.repo file to exclude those packages and tell yum not to look in the EPEL repository for them:
[cc lang="text" escaped="true"][epel]
name=Extra Packages for Enterprise Linux 5 - $basearch
exclude=perl* git*[/cc]


Drop tables with a prefix or wildcard

MySQL is a great database, especially for small and mid-size web applications. You can do loads of stuff with it, but not everything is possible with a regular SQL query.

Imagine you have a database and you want to delete a set of tables. In most of the databases I create, if not all, I use a table prefix: tbl_users, tbl_orders, etc. This is mainly done to improve database security, which can be a serious problem for open-source applications where the database schema is publicly known and could be used to attack your website with SQL injection techniques. The prefix gives your tables relatively unique, difficult-to-guess names, but there are other reasons as well.

But imagine you have a database containing tables with different prefixes and you want to delete/drop just one of the sets: e.g. the set1_ tables but not the set2_ tables.

In many other applications you can use wildcards or even regular expressions to match specific records, but that's not the case in SQL or MySQL, at least not for the DROP TABLE clause. So how do we solve the problem?

You cannot do it with just a single MySQL command; however, you can use MySQL to construct the statement for you.
[cc escaped="true" lang="sql"]SELECT CONCAT( 'DROP TABLE ', GROUP_CONCAT(table_name), ';' ) AS statement
FROM information_schema.tables
WHERE table_name LIKE 'set1_%';[/cc]
This won't delete the tables, but it will produce SQL that can then be executed. Be aware that information_schema.tables includes the tables of all databases, so to narrow the command down further we can specify the database name like this:
[cc escaped="true" lang="sql"]SELECT CONCAT( 'DROP TABLE ', GROUP_CONCAT(table_name), ';' ) AS statement
FROM information_schema.tables
WHERE table_schema = 'database_name' AND table_name LIKE 'set1_%';[/cc]
All that looks pretty good, with the exception of one thing: if you want to automate it, you would need server-side scripting, because you can't execute the generated text just as SQL. The solution is to use prepared statements, which allow you to build the DROP TABLE statement dynamically and execute it. Note that table names cannot be bound as parameters, so the statement text has to be assembled with CONCAT first. Here is an example:
[cc escaped="true" lang="sql"]SET @tables = (SELECT GROUP_CONCAT(table_name) FROM information_schema.tables WHERE table_name LIKE 'set1_%');
SET @drop_sql = CONCAT('DROP TABLE ', @tables);
PREPARE drop_statement FROM @drop_sql;
EXECUTE drop_statement;
DEALLOCATE PREPARE drop_statement;[/cc]
So now this looks perfect, exactly what we needed. Imagine you want to create an automated build with Phing and you want to delete specific tables every time you run "phing build:build"; this SQL makes that possible using the PREPARE and EXECUTE statements.

If you have any ideas or better solutions feel free to add a comment!


PHP type checking

Here's a quick and dirty utility method for type checking an object or an array of objects.
[cc escaped="true" lang="php"]/**
* Ensure the object (or the first item in the
* array of objects) is of the specified type
* or throw an exception
* @static
* @throws Exception if wrong type
* @param $object_or_array
* @param $class_name
* @return bool
*/
public static function validate_instance_of($object_or_array, $class_name) {
    if (is_array($object_or_array)) {
        $to_test = $object_or_array[0];
    } else {
        $to_test = $object_or_array;
    }
    if (!is_null($to_test) && !($to_test instanceof $class_name)) {
        throw new Exception('Object or array of objects expected to be of type ' . $class_name . ', ' . gettype($to_test) . ' received.');
    }
    return true;
}[/cc]
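A quick usage sketch, assuming the method lives on a hypothetical Util class:

```php
<?php
// Hypothetical wrapper class for the helper above.
class Util {
    public static function validate_instance_of($object_or_array, $class_name) {
        $to_test = is_array($object_or_array) ? $object_or_array[0] : $object_or_array;
        if (!is_null($to_test) && !($to_test instanceof $class_name)) {
            throw new Exception('Object or array of objects expected to be of type '
                . $class_name . ', ' . gettype($to_test) . ' received.');
        }
        return true;
    }
}

$date = new DateTime();
var_dump(Util::validate_instance_of($date, 'DateTime'));        // bool(true)
var_dump(Util::validate_instance_of(array($date), 'DateTime')); // bool(true)

try {
    Util::validate_instance_of('not an object', 'DateTime');
} catch (Exception $e) {
    echo $e->getMessage(), PHP_EOL; // reports that a string was received
}
```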
